00:00:00.000 Started by upstream project "autotest-per-patch" build number 127130 00:00:00.000 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.013 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.013 The recommended git tool is: git 00:00:00.013 using credential 00000000-0000-0000-0000-000000000002 00:00:00.015 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.026 Fetching changes from the remote Git repository 00:00:00.027 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.041 Using shallow fetch with depth 1 00:00:00.041 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.041 > git --version # timeout=10 00:00:00.058 > git --version # 'git version 2.39.2' 00:00:00.058 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.084 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.085 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:04.671 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:04.681 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:04.693 Checking out Revision f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 (FETCH_HEAD) 00:00:04.693 > git config core.sparsecheckout # timeout=10 00:00:04.704 > git read-tree -mu HEAD # timeout=10 00:00:04.719 > git checkout -f f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=5 00:00:04.753 Commit message: "spdk-abi-per-patch: fix check-so-deps-docker-autotest parameters" 00:00:04.753 > git rev-list --no-walk f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=10 00:00:04.864 [Pipeline] Start of Pipeline 00:00:04.875 [Pipeline] library 00:00:04.876 Loading library shm_lib@master 00:00:04.876 Library shm_lib@master is cached. Copying from home. 00:00:04.891 [Pipeline] node 00:00:04.902 Running on WFP19 in /var/jenkins/workspace/crypto-phy-autotest 00:00:04.904 [Pipeline] { 00:00:04.912 [Pipeline] catchError 00:00:04.913 [Pipeline] { 00:00:04.922 [Pipeline] wrap 00:00:04.929 [Pipeline] { 00:00:04.934 [Pipeline] stage 00:00:04.935 [Pipeline] { (Prologue) 00:00:05.099 [Pipeline] sh 00:00:05.382 + logger -p user.info -t JENKINS-CI 00:00:05.397 [Pipeline] echo 00:00:05.399 Node: WFP19 00:00:05.404 [Pipeline] sh 00:00:05.696 [Pipeline] setCustomBuildProperty 00:00:05.709 [Pipeline] echo 00:00:05.710 Cleanup processes 00:00:05.716 [Pipeline] sh 00:00:05.997 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:05.997 1398153 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:06.009 [Pipeline] sh 00:00:06.290 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:06.290 ++ grep -v 'sudo pgrep' 00:00:06.290 ++ awk '{print $1}' 00:00:06.290 + sudo kill -9 00:00:06.290 + true 00:00:06.303 [Pipeline] cleanWs 00:00:06.312 [WS-CLEANUP] Deleting project workspace... 00:00:06.312 [WS-CLEANUP] Deferred wipeout is used... 
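For reference, the process-cleanup idiom traced in the "Cleanup processes" step above (pgrep for stale autotest processes in the workspace, filter out the pgrep invocation itself, then kill by PID) can be collected into a small standalone script. This is a sketch reconstructed from the xtrace output, not the pipeline's actual helper; the workspace path is the one used on this node and would differ elsewhere.

#!/usr/bin/env bash
# Kill any SPDK processes left over from a previous run in this workspace.
# Reconstructed from the xtrace above; adjust WORKSPACE for other nodes.
WORKSPACE=/var/jenkins/workspace/crypto-phy-autotest/spdk
pids=$(sudo pgrep -af "$WORKSPACE" | grep -v 'sudo pgrep' | awk '{print $1}')
# The trailing "|| true" mirrors the "+ true" in the trace: an empty PID list
# must not fail the stage.
[ -n "$pids" ] && sudo kill -9 $pids || true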
00:00:06.318 [WS-CLEANUP] done 00:00:06.322 [Pipeline] setCustomBuildProperty 00:00:06.335 [Pipeline] sh 00:00:06.611 + sudo git config --global --replace-all safe.directory '*' 00:00:06.671 [Pipeline] httpRequest 00:00:06.694 [Pipeline] echo 00:00:06.695 Sorcerer 10.211.164.101 is alive 00:00:06.701 [Pipeline] httpRequest 00:00:06.704 HttpMethod: GET 00:00:06.705 URL: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:06.705 Sending request to url: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:06.716 Response Code: HTTP/1.1 200 OK 00:00:06.717 Success: Status code 200 is in the accepted range: 200,404 00:00:06.717 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:08.156 [Pipeline] sh 00:00:08.439 + tar --no-same-owner -xf jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:08.456 [Pipeline] httpRequest 00:00:08.482 [Pipeline] echo 00:00:08.484 Sorcerer 10.211.164.101 is alive 00:00:08.492 [Pipeline] httpRequest 00:00:08.496 HttpMethod: GET 00:00:08.497 URL: http://10.211.164.101/packages/spdk_e5ef9abc9ee9c86a9ff61108fb262630413e40ec.tar.gz 00:00:08.497 Sending request to url: http://10.211.164.101/packages/spdk_e5ef9abc9ee9c86a9ff61108fb262630413e40ec.tar.gz 00:00:08.519 Response Code: HTTP/1.1 200 OK 00:00:08.519 Success: Status code 200 is in the accepted range: 200,404 00:00:08.520 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_e5ef9abc9ee9c86a9ff61108fb262630413e40ec.tar.gz 00:02:07.723 [Pipeline] sh 00:02:08.015 + tar --no-same-owner -xf spdk_e5ef9abc9ee9c86a9ff61108fb262630413e40ec.tar.gz 00:02:11.352 [Pipeline] sh 00:02:11.637 + git -C spdk log --oneline -n5 00:02:11.637 e5ef9abc9 test/scheduler: Add a system level test for the scheduler_set_option RPC 00:02:11.637 223450b47 lib/event: Add support for core isolation in scheduling 00:02:11.637 6a0934c18 lib/event: Modify spdk_reactor_set_interrupt_mode() to be called from scheduling reactor 00:02:11.637 d005e023b raid: fix empty slot not updated in sb after resize 00:02:11.637 f41dbc235 nvme: always specify CC_CSS_NVM when CAP_CSS_IOCS is not set 00:02:11.649 [Pipeline] } 00:02:11.666 [Pipeline] // stage 00:02:11.675 [Pipeline] stage 00:02:11.678 [Pipeline] { (Prepare) 00:02:11.697 [Pipeline] writeFile 00:02:11.714 [Pipeline] sh 00:02:11.999 + logger -p user.info -t JENKINS-CI 00:02:12.012 [Pipeline] sh 00:02:12.297 + logger -p user.info -t JENKINS-CI 00:02:12.311 [Pipeline] sh 00:02:12.597 + cat autorun-spdk.conf 00:02:12.597 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:12.597 SPDK_TEST_BLOCKDEV=1 00:02:12.597 SPDK_TEST_ISAL=1 00:02:12.597 SPDK_TEST_CRYPTO=1 00:02:12.597 SPDK_TEST_REDUCE=1 00:02:12.597 SPDK_TEST_VBDEV_COMPRESS=1 00:02:12.597 SPDK_RUN_UBSAN=1 00:02:12.597 SPDK_TEST_ACCEL=1 00:02:12.604 RUN_NIGHTLY=0 00:02:12.611 [Pipeline] readFile 00:02:12.643 [Pipeline] withEnv 00:02:12.645 [Pipeline] { 00:02:12.662 [Pipeline] sh 00:02:12.951 + set -ex 00:02:12.951 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:02:12.951 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:02:12.951 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:12.951 ++ SPDK_TEST_BLOCKDEV=1 00:02:12.951 ++ SPDK_TEST_ISAL=1 00:02:12.951 ++ SPDK_TEST_CRYPTO=1 00:02:12.951 ++ SPDK_TEST_REDUCE=1 00:02:12.951 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:02:12.951 ++ SPDK_RUN_UBSAN=1 00:02:12.951 ++ SPDK_TEST_ACCEL=1 00:02:12.951 ++ RUN_NIGHTLY=0 00:02:12.951 + case 
$SPDK_TEST_NVMF_NICS in 00:02:12.951 + DRIVERS= 00:02:12.951 + [[ -n '' ]] 00:02:12.951 + exit 0 00:02:12.961 [Pipeline] } 00:02:12.982 [Pipeline] // withEnv 00:02:12.989 [Pipeline] } 00:02:13.007 [Pipeline] // stage 00:02:13.018 [Pipeline] catchError 00:02:13.020 [Pipeline] { 00:02:13.036 [Pipeline] timeout 00:02:13.036 Timeout set to expire in 1 hr 0 min 00:02:13.039 [Pipeline] { 00:02:13.056 [Pipeline] stage 00:02:13.058 [Pipeline] { (Tests) 00:02:13.073 [Pipeline] sh 00:02:13.357 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:02:13.358 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:02:13.358 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:02:13.358 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:02:13.358 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:13.358 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:02:13.358 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:02:13.358 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:02:13.358 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:02:13.358 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:02:13.358 + [[ crypto-phy-autotest == pkgdep-* ]] 00:02:13.358 + cd /var/jenkins/workspace/crypto-phy-autotest 00:02:13.358 + source /etc/os-release 00:02:13.358 ++ NAME='Fedora Linux' 00:02:13.358 ++ VERSION='38 (Cloud Edition)' 00:02:13.358 ++ ID=fedora 00:02:13.358 ++ VERSION_ID=38 00:02:13.358 ++ VERSION_CODENAME= 00:02:13.358 ++ PLATFORM_ID=platform:f38 00:02:13.358 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:02:13.358 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:13.358 ++ LOGO=fedora-logo-icon 00:02:13.358 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:02:13.358 ++ HOME_URL=https://fedoraproject.org/ 00:02:13.358 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:02:13.358 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:13.358 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:13.358 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:13.358 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:02:13.358 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:13.358 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:02:13.358 ++ SUPPORT_END=2024-05-14 00:02:13.358 ++ VARIANT='Cloud Edition' 00:02:13.358 ++ VARIANT_ID=cloud 00:02:13.358 + uname -a 00:02:13.358 Linux spdk-wfp-19 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:02:13.358 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:02:17.556 Hugepages 00:02:17.556 node hugesize free / total 00:02:17.556 node0 1048576kB 0 / 0 00:02:17.556 node0 2048kB 0 / 0 00:02:17.556 node1 1048576kB 0 / 0 00:02:17.556 node1 2048kB 0 / 0 00:02:17.556 00:02:17.556 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:17.556 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:17.556 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:17.556 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:02:17.556 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:17.556 I/OAT 0000:80:04.3 
8086 2021 1 ioatdma - - 00:02:17.556 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:17.557 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:17.557 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:17.557 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:17.557 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:02:17.557 + rm -f /tmp/spdk-ld-path 00:02:17.557 + source autorun-spdk.conf 00:02:17.557 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:17.557 ++ SPDK_TEST_BLOCKDEV=1 00:02:17.557 ++ SPDK_TEST_ISAL=1 00:02:17.557 ++ SPDK_TEST_CRYPTO=1 00:02:17.557 ++ SPDK_TEST_REDUCE=1 00:02:17.557 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:02:17.557 ++ SPDK_RUN_UBSAN=1 00:02:17.557 ++ SPDK_TEST_ACCEL=1 00:02:17.557 ++ RUN_NIGHTLY=0 00:02:17.557 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:17.557 + [[ -n '' ]] 00:02:17.557 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:17.557 + for M in /var/spdk/build-*-manifest.txt 00:02:17.557 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:17.557 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:02:17.557 + for M in /var/spdk/build-*-manifest.txt 00:02:17.557 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:17.557 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:02:17.557 ++ uname 00:02:17.557 + [[ Linux == \L\i\n\u\x ]] 00:02:17.557 + sudo dmesg -T 00:02:17.557 + sudo dmesg --clear 00:02:17.557 + dmesg_pid=1399777 00:02:17.557 + sudo dmesg -Tw 00:02:17.557 + [[ Fedora Linux == FreeBSD ]] 00:02:17.557 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:17.557 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:17.557 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:17.557 + [[ -x /usr/src/fio-static/fio ]] 00:02:17.557 + export FIO_BIN=/usr/src/fio-static/fio 00:02:17.557 + FIO_BIN=/usr/src/fio-static/fio 00:02:17.557 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:17.557 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:02:17.557 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:17.557 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:17.557 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:17.557 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:17.557 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:17.557 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:17.557 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:02:17.557 Test configuration: 00:02:17.557 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:17.557 SPDK_TEST_BLOCKDEV=1 00:02:17.557 SPDK_TEST_ISAL=1 00:02:17.557 SPDK_TEST_CRYPTO=1 00:02:17.557 SPDK_TEST_REDUCE=1 00:02:17.557 SPDK_TEST_VBDEV_COMPRESS=1 00:02:17.557 SPDK_RUN_UBSAN=1 00:02:17.557 SPDK_TEST_ACCEL=1 00:02:17.557 RUN_NIGHTLY=0 07:07:49 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:17.557 07:07:49 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:02:17.557 07:07:49 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:17.557 07:07:49 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:17.557 07:07:49 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.557 07:07:49 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.557 07:07:49 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.557 07:07:49 -- paths/export.sh@5 -- $ export PATH 00:02:17.557 07:07:49 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.557 07:07:49 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:17.557 07:07:49 -- common/autobuild_common.sh@447 -- $ date +%s 00:02:17.557 07:07:49 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721884069.XXXXXX 00:02:17.557 07:07:49 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721884069.8qRIUU 00:02:17.557 07:07:49 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:02:17.557 07:07:49 -- 
common/autobuild_common.sh@453 -- $ '[' -n '' ']' 00:02:17.557 07:07:49 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:02:17.557 07:07:49 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:17.557 07:07:49 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:17.557 07:07:49 -- common/autobuild_common.sh@463 -- $ get_config_params 00:02:17.557 07:07:49 -- common/autotest_common.sh@398 -- $ xtrace_disable 00:02:17.557 07:07:49 -- common/autotest_common.sh@10 -- $ set +x 00:02:17.557 07:07:49 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:02:17.557 07:07:49 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:02:17.557 07:07:49 -- pm/common@17 -- $ local monitor 00:02:17.557 07:07:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.557 07:07:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.557 07:07:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.557 07:07:49 -- pm/common@21 -- $ date +%s 00:02:17.557 07:07:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.557 07:07:49 -- pm/common@21 -- $ date +%s 00:02:17.557 07:07:49 -- pm/common@25 -- $ sleep 1 00:02:17.557 07:07:49 -- pm/common@21 -- $ date +%s 00:02:17.557 07:07:49 -- pm/common@21 -- $ date +%s 00:02:17.557 07:07:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721884069 00:02:17.557 07:07:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721884069 00:02:17.557 07:07:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721884069 00:02:17.557 07:07:49 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721884069 00:02:17.557 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721884069_collect-vmstat.pm.log 00:02:17.557 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721884069_collect-cpu-load.pm.log 00:02:17.557 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721884069_collect-cpu-temp.pm.log 00:02:17.557 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721884069_collect-bmc-pm.bmc.pm.log 00:02:18.497 07:07:50 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:02:18.497 
07:07:50 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:02:18.497 07:07:50 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:18.497 07:07:50 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:18.497 07:07:50 -- spdk/autobuild.sh@16 -- $ date -u 00:02:18.497 Thu Jul 25 05:07:50 AM UTC 2024 00:02:18.497 07:07:50 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:18.497 v24.09-pre-321-ge5ef9abc9 00:02:18.497 07:07:50 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:18.497 07:07:50 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:18.497 07:07:50 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:18.497 07:07:50 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:02:18.497 07:07:50 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:18.497 07:07:50 -- common/autotest_common.sh@10 -- $ set +x 00:02:18.497 ************************************ 00:02:18.497 START TEST ubsan 00:02:18.497 ************************************ 00:02:18.497 07:07:50 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:02:18.497 using ubsan 00:02:18.497 00:02:18.497 real 0m0.001s 00:02:18.497 user 0m0.000s 00:02:18.497 sys 0m0.000s 00:02:18.497 07:07:50 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:18.497 07:07:50 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:18.497 ************************************ 00:02:18.497 END TEST ubsan 00:02:18.497 ************************************ 00:02:18.497 07:07:51 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:02:18.497 07:07:51 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:18.497 07:07:51 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:18.497 07:07:51 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:02:18.497 07:07:51 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:18.497 07:07:51 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:18.497 07:07:51 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:18.497 07:07:51 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:02:18.497 07:07:51 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:02:18.756 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:02:18.756 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:19.032 Using 'verbs' RDMA provider 00:02:35.298 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:02:50.189 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:02:50.189 Creating mk/config.mk...done. 00:02:50.189 Creating mk/cc.flags.mk...done. 00:02:50.189 Type 'make' to build. 00:02:50.189 07:08:21 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:50.190 07:08:21 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:02:50.190 07:08:21 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:50.190 07:08:21 -- common/autotest_common.sh@10 -- $ set +x 00:02:50.190 ************************************ 00:02:50.190 START TEST make 00:02:50.190 ************************************ 00:02:50.190 07:08:22 make -- common/autotest_common.sh@1125 -- $ make -j112 00:02:50.190 make[1]: Nothing to be done for 'all'. 
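To reproduce this job's build outside Jenkins, the two inputs that matter are the autorun-spdk.conf printed above and the configure flags autobuild passes. Below is a minimal local sketch, assuming SPDK is checked out at ./spdk and that fio and the crypto/compress dependencies already exist at the paths the flags reference; the conf contents and configure flags are copied from the log, everything else is an assumption.

# Test configuration taken verbatim from the log above.
cat > autorun-spdk.conf <<'EOF'
SPDK_RUN_FUNCTIONAL_TEST=1
SPDK_TEST_BLOCKDEV=1
SPDK_TEST_ISAL=1
SPDK_TEST_CRYPTO=1
SPDK_TEST_REDUCE=1
SPDK_TEST_VBDEV_COMPRESS=1
SPDK_RUN_UBSAN=1
SPDK_TEST_ACCEL=1
RUN_NIGHTLY=0
EOF

# Same configure invocation autobuild ran above (flags copied from the log;
# --with-shared is the extra flag autobuild appends itself).
./spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd \
    --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests \
    --with-vbdev-compress --with-dpdk-compressdev --with-crypto \
    --enable-ubsan --enable-coverage --with-ublk --with-shared

# Then either build directly, or let autorun.sh drive the same test flow
# this job runs (the job used make -j112; -j"$(nproc)" is a local stand-in):
#   make -C spdk -j"$(nproc)"
#   ./spdk/autorun.sh "$PWD/autorun-spdk.conf"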
00:03:28.945 The Meson build system 00:03:28.945 Version: 1.3.1 00:03:28.945 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:03:28.945 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:03:28.945 Build type: native build 00:03:28.945 Program cat found: YES (/usr/bin/cat) 00:03:28.945 Project name: DPDK 00:03:28.945 Project version: 24.03.0 00:03:28.945 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:03:28.945 C linker for the host machine: cc ld.bfd 2.39-16 00:03:28.945 Host machine cpu family: x86_64 00:03:28.945 Host machine cpu: x86_64 00:03:28.945 Message: ## Building in Developer Mode ## 00:03:28.945 Program pkg-config found: YES (/usr/bin/pkg-config) 00:03:28.945 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:03:28.945 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:03:28.945 Program python3 found: YES (/usr/bin/python3) 00:03:28.945 Program cat found: YES (/usr/bin/cat) 00:03:28.945 Compiler for C supports arguments -march=native: YES 00:03:28.945 Checking for size of "void *" : 8 00:03:28.945 Checking for size of "void *" : 8 (cached) 00:03:28.945 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:03:28.945 Library m found: YES 00:03:28.945 Library numa found: YES 00:03:28.945 Has header "numaif.h" : YES 00:03:28.945 Library fdt found: NO 00:03:28.945 Library execinfo found: NO 00:03:28.945 Has header "execinfo.h" : YES 00:03:28.945 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:03:28.945 Run-time dependency libarchive found: NO (tried pkgconfig) 00:03:28.945 Run-time dependency libbsd found: NO (tried pkgconfig) 00:03:28.945 Run-time dependency jansson found: NO (tried pkgconfig) 00:03:28.945 Run-time dependency openssl found: YES 3.0.9 00:03:28.945 Run-time dependency libpcap found: YES 1.10.4 00:03:28.945 Has header "pcap.h" with dependency libpcap: YES 00:03:28.945 Compiler for C supports arguments -Wcast-qual: YES 00:03:28.945 Compiler for C supports arguments -Wdeprecated: YES 00:03:28.945 Compiler for C supports arguments -Wformat: YES 00:03:28.945 Compiler for C supports arguments -Wformat-nonliteral: NO 00:03:28.945 Compiler for C supports arguments -Wformat-security: NO 00:03:28.945 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:28.945 Compiler for C supports arguments -Wmissing-prototypes: YES 00:03:28.945 Compiler for C supports arguments -Wnested-externs: YES 00:03:28.945 Compiler for C supports arguments -Wold-style-definition: YES 00:03:28.945 Compiler for C supports arguments -Wpointer-arith: YES 00:03:28.945 Compiler for C supports arguments -Wsign-compare: YES 00:03:28.945 Compiler for C supports arguments -Wstrict-prototypes: YES 00:03:28.945 Compiler for C supports arguments -Wundef: YES 00:03:28.945 Compiler for C supports arguments -Wwrite-strings: YES 00:03:28.945 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:03:28.945 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:03:28.945 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:28.945 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:03:28.945 Program objdump found: YES (/usr/bin/objdump) 00:03:28.945 Compiler for C supports arguments -mavx512f: YES 00:03:28.945 Checking if "AVX512 checking" compiles: YES 00:03:28.945 
Fetching value of define "__SSE4_2__" : 1 00:03:28.945 Fetching value of define "__AES__" : 1 00:03:28.945 Fetching value of define "__AVX__" : 1 00:03:28.945 Fetching value of define "__AVX2__" : 1 00:03:28.945 Fetching value of define "__AVX512BW__" : 1 00:03:28.945 Fetching value of define "__AVX512CD__" : 1 00:03:28.945 Fetching value of define "__AVX512DQ__" : 1 00:03:28.945 Fetching value of define "__AVX512F__" : 1 00:03:28.945 Fetching value of define "__AVX512VL__" : 1 00:03:28.945 Fetching value of define "__PCLMUL__" : 1 00:03:28.945 Fetching value of define "__RDRND__" : 1 00:03:28.945 Fetching value of define "__RDSEED__" : 1 00:03:28.945 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:03:28.945 Fetching value of define "__znver1__" : (undefined) 00:03:28.945 Fetching value of define "__znver2__" : (undefined) 00:03:28.945 Fetching value of define "__znver3__" : (undefined) 00:03:28.945 Fetching value of define "__znver4__" : (undefined) 00:03:28.945 Compiler for C supports arguments -Wno-format-truncation: YES 00:03:28.945 Message: lib/log: Defining dependency "log" 00:03:28.946 Message: lib/kvargs: Defining dependency "kvargs" 00:03:28.946 Message: lib/telemetry: Defining dependency "telemetry" 00:03:28.946 Checking for function "getentropy" : NO 00:03:28.946 Message: lib/eal: Defining dependency "eal" 00:03:28.946 Message: lib/ring: Defining dependency "ring" 00:03:28.946 Message: lib/rcu: Defining dependency "rcu" 00:03:28.946 Message: lib/mempool: Defining dependency "mempool" 00:03:28.946 Message: lib/mbuf: Defining dependency "mbuf" 00:03:28.946 Fetching value of define "__PCLMUL__" : 1 (cached) 00:03:28.946 Fetching value of define "__AVX512F__" : 1 (cached) 00:03:28.946 Fetching value of define "__AVX512BW__" : 1 (cached) 00:03:28.946 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:03:28.946 Fetching value of define "__AVX512VL__" : 1 (cached) 00:03:28.946 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:03:28.946 Compiler for C supports arguments -mpclmul: YES 00:03:28.946 Compiler for C supports arguments -maes: YES 00:03:28.946 Compiler for C supports arguments -mavx512f: YES (cached) 00:03:28.946 Compiler for C supports arguments -mavx512bw: YES 00:03:28.946 Compiler for C supports arguments -mavx512dq: YES 00:03:28.946 Compiler for C supports arguments -mavx512vl: YES 00:03:28.946 Compiler for C supports arguments -mvpclmulqdq: YES 00:03:28.946 Compiler for C supports arguments -mavx2: YES 00:03:28.946 Compiler for C supports arguments -mavx: YES 00:03:28.946 Message: lib/net: Defining dependency "net" 00:03:28.946 Message: lib/meter: Defining dependency "meter" 00:03:28.946 Message: lib/ethdev: Defining dependency "ethdev" 00:03:28.946 Message: lib/pci: Defining dependency "pci" 00:03:28.946 Message: lib/cmdline: Defining dependency "cmdline" 00:03:28.946 Message: lib/hash: Defining dependency "hash" 00:03:28.946 Message: lib/timer: Defining dependency "timer" 00:03:28.946 Message: lib/compressdev: Defining dependency "compressdev" 00:03:28.946 Message: lib/cryptodev: Defining dependency "cryptodev" 00:03:28.946 Message: lib/dmadev: Defining dependency "dmadev" 00:03:28.946 Compiler for C supports arguments -Wno-cast-qual: YES 00:03:28.946 Message: lib/power: Defining dependency "power" 00:03:28.946 Message: lib/reorder: Defining dependency "reorder" 00:03:28.946 Message: lib/security: Defining dependency "security" 00:03:28.946 Has header "linux/userfaultfd.h" : YES 00:03:28.946 Has header "linux/vduse.h" : YES 00:03:28.946 
Message: lib/vhost: Defining dependency "vhost" 00:03:28.946 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:03:28.946 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:03:28.946 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:03:28.946 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:03:28.946 Compiler for C supports arguments -std=c11: YES 00:03:28.946 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:03:28.946 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:03:28.946 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:03:28.946 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:03:28.946 Run-time dependency libmlx5 found: YES 1.24.44.0 00:03:28.946 Run-time dependency libibverbs found: YES 1.14.44.0 00:03:28.946 Library mtcr_ul found: NO 00:03:28.946 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has 
symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:03:28.946 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol 
"mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:03:28.946 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:03:28.946 Configuring mlx5_autoconf.h using configuration 00:03:28.946 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:03:28.946 Run-time dependency libcrypto found: YES 3.0.9 00:03:28.946 Library IPSec_MB found: YES 00:03:28.946 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:03:28.946 Message: drivers/common/qat: Defining dependency "common_qat" 00:03:28.946 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:03:28.946 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:03:28.946 Library IPSec_MB found: YES 00:03:28.946 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:03:28.947 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:03:28.947 Compiler for C supports arguments -std=c11: YES (cached) 00:03:28.947 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:03:28.947 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:28.947 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:28.947 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:28.947 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:03:28.947 Run-time dependency libisal found: NO (tried pkgconfig) 00:03:28.947 Library libisal found: NO 00:03:28.947 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:03:28.947 Compiler for C supports arguments -std=c11: YES (cached) 00:03:28.947 Compiler for C supports arguments -Wno-strict-prototypes: YES 
(cached) 00:03:28.947 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:28.947 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:28.947 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:28.947 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:03:28.947 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:03:28.947 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:03:28.947 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:03:28.947 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:03:28.947 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:03:28.947 Program doxygen found: YES (/usr/bin/doxygen) 00:03:28.947 Configuring doxy-api-html.conf using configuration 00:03:28.947 Configuring doxy-api-man.conf using configuration 00:03:28.947 Program mandb found: YES (/usr/bin/mandb) 00:03:28.947 Program sphinx-build found: NO 00:03:28.947 Configuring rte_build_config.h using configuration 00:03:28.947 Message: 00:03:28.947 ================= 00:03:28.947 Applications Enabled 00:03:28.947 ================= 00:03:28.947 00:03:28.947 apps: 00:03:28.947 00:03:28.947 00:03:28.947 Message: 00:03:28.947 ================= 00:03:28.947 Libraries Enabled 00:03:28.947 ================= 00:03:28.947 00:03:28.947 libs: 00:03:28.947 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:03:28.947 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:03:28.947 cryptodev, dmadev, power, reorder, security, vhost, 00:03:28.947 00:03:28.947 Message: 00:03:28.947 =============== 00:03:28.947 Drivers Enabled 00:03:28.947 =============== 00:03:28.947 00:03:28.947 common: 00:03:28.947 mlx5, qat, 00:03:28.947 bus: 00:03:28.947 auxiliary, pci, vdev, 00:03:28.947 mempool: 00:03:28.947 ring, 00:03:28.947 dma: 00:03:28.947 00:03:28.947 net: 00:03:28.947 00:03:28.947 crypto: 00:03:28.947 ipsec_mb, mlx5, 00:03:28.947 compress: 00:03:28.947 isal, mlx5, 00:03:28.947 vdpa: 00:03:28.947 00:03:28.947 00:03:28.947 Message: 00:03:28.947 ================= 00:03:28.947 Content Skipped 00:03:28.947 ================= 00:03:28.947 00:03:28.947 apps: 00:03:28.947 dumpcap: explicitly disabled via build config 00:03:28.947 graph: explicitly disabled via build config 00:03:28.947 pdump: explicitly disabled via build config 00:03:28.947 proc-info: explicitly disabled via build config 00:03:28.947 test-acl: explicitly disabled via build config 00:03:28.947 test-bbdev: explicitly disabled via build config 00:03:28.947 test-cmdline: explicitly disabled via build config 00:03:28.947 test-compress-perf: explicitly disabled via build config 00:03:28.947 test-crypto-perf: explicitly disabled via build config 00:03:28.947 test-dma-perf: explicitly disabled via build config 00:03:28.947 test-eventdev: explicitly disabled via build config 00:03:28.947 test-fib: explicitly disabled via build config 00:03:28.947 test-flow-perf: explicitly disabled via build config 00:03:28.947 test-gpudev: explicitly disabled via build config 00:03:28.947 test-mldev: explicitly disabled via build config 00:03:28.947 test-pipeline: explicitly disabled via build config 00:03:28.947 test-pmd: explicitly disabled via build config 00:03:28.947 test-regex: explicitly disabled via build config 00:03:28.947 test-sad: explicitly disabled via build config 00:03:28.947 test-security-perf: explicitly disabled via build config 00:03:28.947 00:03:28.947 libs: 
00:03:28.947 argparse: explicitly disabled via build config 00:03:28.947 metrics: explicitly disabled via build config 00:03:28.947 acl: explicitly disabled via build config 00:03:28.947 bbdev: explicitly disabled via build config 00:03:28.947 bitratestats: explicitly disabled via build config 00:03:28.947 bpf: explicitly disabled via build config 00:03:28.947 cfgfile: explicitly disabled via build config 00:03:28.947 distributor: explicitly disabled via build config 00:03:28.947 efd: explicitly disabled via build config 00:03:28.947 eventdev: explicitly disabled via build config 00:03:28.947 dispatcher: explicitly disabled via build config 00:03:28.947 gpudev: explicitly disabled via build config 00:03:28.947 gro: explicitly disabled via build config 00:03:28.947 gso: explicitly disabled via build config 00:03:28.947 ip_frag: explicitly disabled via build config 00:03:28.947 jobstats: explicitly disabled via build config 00:03:28.947 latencystats: explicitly disabled via build config 00:03:28.947 lpm: explicitly disabled via build config 00:03:28.947 member: explicitly disabled via build config 00:03:28.947 pcapng: explicitly disabled via build config 00:03:28.947 rawdev: explicitly disabled via build config 00:03:28.947 regexdev: explicitly disabled via build config 00:03:28.947 mldev: explicitly disabled via build config 00:03:28.947 rib: explicitly disabled via build config 00:03:28.947 sched: explicitly disabled via build config 00:03:28.947 stack: explicitly disabled via build config 00:03:28.947 ipsec: explicitly disabled via build config 00:03:28.947 pdcp: explicitly disabled via build config 00:03:28.947 fib: explicitly disabled via build config 00:03:28.947 port: explicitly disabled via build config 00:03:28.947 pdump: explicitly disabled via build config 00:03:28.947 table: explicitly disabled via build config 00:03:28.947 pipeline: explicitly disabled via build config 00:03:28.947 graph: explicitly disabled via build config 00:03:28.947 node: explicitly disabled via build config 00:03:28.947 00:03:28.947 drivers: 00:03:28.947 common/cpt: not in enabled drivers build config 00:03:28.947 common/dpaax: not in enabled drivers build config 00:03:28.947 common/iavf: not in enabled drivers build config 00:03:28.947 common/idpf: not in enabled drivers build config 00:03:28.947 common/ionic: not in enabled drivers build config 00:03:28.947 common/mvep: not in enabled drivers build config 00:03:28.947 common/octeontx: not in enabled drivers build config 00:03:28.947 bus/cdx: not in enabled drivers build config 00:03:28.947 bus/dpaa: not in enabled drivers build config 00:03:28.947 bus/fslmc: not in enabled drivers build config 00:03:28.947 bus/ifpga: not in enabled drivers build config 00:03:28.947 bus/platform: not in enabled drivers build config 00:03:28.947 bus/uacce: not in enabled drivers build config 00:03:28.947 bus/vmbus: not in enabled drivers build config 00:03:28.947 common/cnxk: not in enabled drivers build config 00:03:28.947 common/nfp: not in enabled drivers build config 00:03:28.947 common/nitrox: not in enabled drivers build config 00:03:28.947 common/sfc_efx: not in enabled drivers build config 00:03:28.947 mempool/bucket: not in enabled drivers build config 00:03:28.947 mempool/cnxk: not in enabled drivers build config 00:03:28.947 mempool/dpaa: not in enabled drivers build config 00:03:28.947 mempool/dpaa2: not in enabled drivers build config 00:03:28.947 mempool/octeontx: not in enabled drivers build config 00:03:28.947 mempool/stack: not in enabled drivers build 
config 00:03:28.947 dma/cnxk: not in enabled drivers build config 00:03:28.947 dma/dpaa: not in enabled drivers build config 00:03:28.947 dma/dpaa2: not in enabled drivers build config 00:03:28.947 dma/hisilicon: not in enabled drivers build config 00:03:28.947 dma/idxd: not in enabled drivers build config 00:03:28.947 dma/ioat: not in enabled drivers build config 00:03:28.947 dma/skeleton: not in enabled drivers build config 00:03:28.947 net/af_packet: not in enabled drivers build config 00:03:28.947 net/af_xdp: not in enabled drivers build config 00:03:28.947 net/ark: not in enabled drivers build config 00:03:28.947 net/atlantic: not in enabled drivers build config 00:03:28.947 net/avp: not in enabled drivers build config 00:03:28.947 net/axgbe: not in enabled drivers build config 00:03:28.947 net/bnx2x: not in enabled drivers build config 00:03:28.947 net/bnxt: not in enabled drivers build config 00:03:28.947 net/bonding: not in enabled drivers build config 00:03:28.947 net/cnxk: not in enabled drivers build config 00:03:28.947 net/cpfl: not in enabled drivers build config 00:03:28.947 net/cxgbe: not in enabled drivers build config 00:03:28.947 net/dpaa: not in enabled drivers build config 00:03:28.947 net/dpaa2: not in enabled drivers build config 00:03:28.947 net/e1000: not in enabled drivers build config 00:03:28.947 net/ena: not in enabled drivers build config 00:03:28.947 net/enetc: not in enabled drivers build config 00:03:28.947 net/enetfec: not in enabled drivers build config 00:03:28.947 net/enic: not in enabled drivers build config 00:03:28.947 net/failsafe: not in enabled drivers build config 00:03:28.947 net/fm10k: not in enabled drivers build config 00:03:28.947 net/gve: not in enabled drivers build config 00:03:28.947 net/hinic: not in enabled drivers build config 00:03:28.947 net/hns3: not in enabled drivers build config 00:03:28.947 net/i40e: not in enabled drivers build config 00:03:28.947 net/iavf: not in enabled drivers build config 00:03:28.947 net/ice: not in enabled drivers build config 00:03:28.947 net/idpf: not in enabled drivers build config 00:03:28.947 net/igc: not in enabled drivers build config 00:03:28.947 net/ionic: not in enabled drivers build config 00:03:28.947 net/ipn3ke: not in enabled drivers build config 00:03:28.947 net/ixgbe: not in enabled drivers build config 00:03:28.947 net/mana: not in enabled drivers build config 00:03:28.947 net/memif: not in enabled drivers build config 00:03:28.948 net/mlx4: not in enabled drivers build config 00:03:28.948 net/mlx5: not in enabled drivers build config 00:03:28.948 net/mvneta: not in enabled drivers build config 00:03:28.948 net/mvpp2: not in enabled drivers build config 00:03:28.948 net/netvsc: not in enabled drivers build config 00:03:28.948 net/nfb: not in enabled drivers build config 00:03:28.948 net/nfp: not in enabled drivers build config 00:03:28.948 net/ngbe: not in enabled drivers build config 00:03:28.948 net/null: not in enabled drivers build config 00:03:28.948 net/octeontx: not in enabled drivers build config 00:03:28.948 net/octeon_ep: not in enabled drivers build config 00:03:28.948 net/pcap: not in enabled drivers build config 00:03:28.948 net/pfe: not in enabled drivers build config 00:03:28.948 net/qede: not in enabled drivers build config 00:03:28.948 net/ring: not in enabled drivers build config 00:03:28.948 net/sfc: not in enabled drivers build config 00:03:28.948 net/softnic: not in enabled drivers build config 00:03:28.948 net/tap: not in enabled drivers build config 00:03:28.948 
net/thunderx: not in enabled drivers build config 00:03:28.948 net/txgbe: not in enabled drivers build config 00:03:28.948 net/vdev_netvsc: not in enabled drivers build config 00:03:28.948 net/vhost: not in enabled drivers build config 00:03:28.948 net/virtio: not in enabled drivers build config 00:03:28.948 net/vmxnet3: not in enabled drivers build config 00:03:28.948 raw/*: missing internal dependency, "rawdev" 00:03:28.948 crypto/armv8: not in enabled drivers build config 00:03:28.948 crypto/bcmfs: not in enabled drivers build config 00:03:28.948 crypto/caam_jr: not in enabled drivers build config 00:03:28.948 crypto/ccp: not in enabled drivers build config 00:03:28.948 crypto/cnxk: not in enabled drivers build config 00:03:28.948 crypto/dpaa_sec: not in enabled drivers build config 00:03:28.948 crypto/dpaa2_sec: not in enabled drivers build config 00:03:28.948 crypto/mvsam: not in enabled drivers build config 00:03:28.948 crypto/nitrox: not in enabled drivers build config 00:03:28.948 crypto/null: not in enabled drivers build config 00:03:28.948 crypto/octeontx: not in enabled drivers build config 00:03:28.948 crypto/openssl: not in enabled drivers build config 00:03:28.948 crypto/scheduler: not in enabled drivers build config 00:03:28.948 crypto/uadk: not in enabled drivers build config 00:03:28.948 crypto/virtio: not in enabled drivers build config 00:03:28.948 compress/nitrox: not in enabled drivers build config 00:03:28.948 compress/octeontx: not in enabled drivers build config 00:03:28.948 compress/zlib: not in enabled drivers build config 00:03:28.948 regex/*: missing internal dependency, "regexdev" 00:03:28.948 ml/*: missing internal dependency, "mldev" 00:03:28.948 vdpa/ifc: not in enabled drivers build config 00:03:28.948 vdpa/mlx5: not in enabled drivers build config 00:03:28.948 vdpa/nfp: not in enabled drivers build config 00:03:28.948 vdpa/sfc: not in enabled drivers build config 00:03:28.948 event/*: missing internal dependency, "eventdev" 00:03:28.948 baseband/*: missing internal dependency, "bbdev" 00:03:28.948 gpu/*: missing internal dependency, "gpudev" 00:03:28.948 00:03:28.948 00:03:28.948 Build targets in project: 115 00:03:28.948 00:03:28.948 DPDK 24.03.0 00:03:28.948 00:03:28.948 User defined options 00:03:28.948 buildtype : debug 00:03:28.948 default_library : shared 00:03:28.948 libdir : lib 00:03:28.948 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:03:28.948 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:03:28.948 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:03:28.948 cpu_instruction_set: native 00:03:28.948 disable_apps : test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump 00:03:28.948 disable_libs : bbdev,argparse,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump 00:03:28.948 enable_docs : false 00:03:28.948 
enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:03:28.948 enable_kmods : false 00:03:28.948 max_lcores : 128 00:03:28.948 tests : false 00:03:28.948 00:03:28.948 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:28.948 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:03:28.948 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:03:28.948 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:03:28.948 [3/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:03:28.948 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:03:28.948 [5/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:03:29.211 [6/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:03:29.211 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:03:29.211 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:03:29.211 [9/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:03:29.211 [10/378] Linking static target lib/librte_kvargs.a 00:03:29.211 [11/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:03:29.211 [12/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:03:29.211 [13/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:03:29.211 [14/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:03:29.211 [15/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:03:29.211 [16/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:03:29.211 [17/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:03:29.211 [18/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:03:29.211 [19/378] Linking static target lib/librte_log.a 00:03:29.211 [20/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:03:29.211 [21/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:03:29.211 [22/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:03:29.211 [23/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:03:29.211 [24/378] Linking static target lib/librte_pci.a 00:03:29.211 [25/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:03:29.474 [26/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:03:29.474 [27/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:03:29.474 [28/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:03:29.474 [29/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:03:29.474 [30/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:03:29.474 [31/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:03:29.474 [32/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:03:29.474 [33/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:03:29.474 [34/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:03:29.474 [35/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:03:29.739 [36/378] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:03:29.739 [37/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:03:29.739 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:03:29.739 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:03:29.739 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:03:29.739 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:03:29.739 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:03:29.739 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:03:29.739 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:03:29.739 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:03:29.739 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:03:29.739 [47/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:03:29.739 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:03:29.739 [49/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:03:29.739 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:03:29.739 [51/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:03:29.739 [52/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:03:29.739 [53/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:03:29.739 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:03:29.739 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:03:29.739 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:03:29.739 [57/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:03:29.739 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:03:29.739 [59/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:03:29.739 [60/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:03:29.739 [61/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:03:29.739 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:03:29.739 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:03:29.739 [64/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:03:29.739 [65/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:03:29.739 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:03:29.739 [67/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:03:29.739 [68/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:03:29.739 [69/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:03:29.739 [70/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:03:29.739 [71/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:03:29.739 [72/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:03:29.739 [73/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:03:29.739 [74/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:03:29.739 [75/378] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:03:29.739 [76/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:03:29.739 [77/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:03:29.739 [78/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:03:29.739 [79/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:03:29.739 [80/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:03:29.739 [81/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:03:29.739 [82/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:03:29.739 [83/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:03:29.739 [84/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:03:29.739 [85/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:03:29.739 [86/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:03:29.739 [87/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:03:29.739 [88/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:03:29.739 [89/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:03:29.739 [90/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:29.739 [91/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:03:29.739 [92/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:03:30.002 [93/378] Linking static target lib/librte_ring.a 00:03:30.002 [94/378] Linking static target lib/librte_meter.a 00:03:30.003 [95/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:03:30.003 [96/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:03:30.003 [97/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:03:30.003 [98/378] Linking static target lib/librte_timer.a 00:03:30.003 [99/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:03:30.003 [100/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:03:30.003 [101/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:03:30.003 [102/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:03:30.003 [103/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:03:30.003 [104/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:03:30.003 [105/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:03:30.003 [106/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:03:30.003 [107/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:03:30.003 [108/378] Linking static target lib/librte_telemetry.a 00:03:30.003 [109/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:03:30.003 [110/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:03:30.003 [111/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:03:30.003 [112/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:03:30.003 [113/378] Linking static target lib/librte_cmdline.a 00:03:30.003 [114/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:03:30.003 [115/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:03:30.003 [116/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:03:30.003 [117/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:03:30.003 [118/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:03:30.003 [119/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:03:30.003 [120/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:03:30.003 [121/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:03:30.003 [122/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:03:30.003 [123/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:03:30.003 [124/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:03:30.003 [125/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:03:30.003 [126/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:03:30.003 [127/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:03:30.003 [128/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:03:30.003 [129/378] Linking static target lib/librte_net.a 00:03:30.003 [130/378] Linking static target lib/librte_rcu.a 00:03:30.003 [131/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:03:30.003 [132/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:03:30.003 [133/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:03:30.003 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:03:30.003 [135/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:03:30.003 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:03:30.003 [137/378] Linking static target lib/librte_eal.a 00:03:30.003 [138/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:03:30.003 [139/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:03:30.003 [140/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:03:30.262 [141/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:03:30.262 [142/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:03:30.262 [143/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:03:30.262 [144/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:03:30.262 [145/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:03:30.262 [146/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:03:30.262 [147/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:03:30.262 [148/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:03:30.262 [149/378] Linking static target lib/librte_compressdev.a 00:03:30.262 [150/378] Linking static target lib/librte_dmadev.a 00:03:30.262 [151/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:03:30.262 [152/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:03:30.262 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:03:30.262 [154/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.262 [155/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:03:30.262 [156/378] Compiling C object 
lib/librte_power.a.p/power_power_intel_uncore.c.o 00:03:30.262 [157/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:03:30.520 [158/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:03:30.520 [159/378] Linking target lib/librte_log.so.24.1 00:03:30.520 [160/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:03:30.520 [161/378] Linking static target lib/librte_mbuf.a 00:03:30.520 [162/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.520 [163/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:03:30.520 [164/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:03:30.520 [165/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.520 [166/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:03:30.520 [167/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.520 [168/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.520 [169/378] Linking static target lib/librte_reorder.a 00:03:30.520 [170/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:03:30.520 [171/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.520 [172/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:03:30.520 [173/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:03:30.520 [174/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:03:30.520 [175/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:03:30.520 [176/378] Linking static target lib/librte_hash.a 00:03:30.520 [177/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:03:30.520 [178/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:03:30.520 [179/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:03:30.520 [180/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:03:30.520 [181/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:03:30.520 [182/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:03:30.520 [183/378] Linking static target lib/librte_power.a 00:03:30.520 [184/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:03:30.520 [185/378] Linking target lib/librte_kvargs.so.24.1 00:03:30.520 [186/378] Linking static target lib/librte_security.a 00:03:30.779 [187/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:03:30.779 [188/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:03:30.779 [189/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:03:30.779 [190/378] Linking static target lib/librte_mempool.a 00:03:30.779 [191/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:03:30.779 [192/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:03:30.779 [193/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:03:30.779 [194/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.779 [195/378] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:03:30.779 [196/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:03:30.779 [197/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:03:30.779 [198/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:03:30.779 [199/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:03:30.779 [200/378] Linking target lib/librte_telemetry.so.24.1 00:03:30.779 [201/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:03:30.779 [202/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:03:30.779 [203/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:03:30.779 [204/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:03:30.779 [205/378] Linking static target lib/librte_cryptodev.a 00:03:30.779 [206/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:03:30.779 [207/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:03:30.779 [208/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:03:30.779 [209/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:03:30.779 [210/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:03:30.779 [211/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:03:30.779 [212/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:03:30.779 [213/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:03:30.779 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:03:30.779 [215/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:03:30.779 [216/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:03:30.779 [217/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:03:30.779 [218/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:03:30.779 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:03:30.779 [220/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:03:30.779 [221/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:03:30.779 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:03:30.779 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:03:30.779 [224/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:03:30.779 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:03:30.779 [226/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:30.779 [227/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:30.779 [228/378] Linking static target drivers/librte_bus_vdev.a 00:03:30.779 [229/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:03:30.779 [230/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:03:30.779 [231/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:30.779 [232/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:03:30.779 [233/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:30.779 [234/378] Linking static target drivers/librte_bus_auxiliary.a 00:03:30.779 [235/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.779 [236/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:03:30.779 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:03:30.779 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:03:30.779 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:03:30.779 [240/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:03:31.038 [241/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:03:31.038 [242/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.038 [243/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:03:31.038 [244/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:03:31.038 [245/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:03:31.038 [246/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:03:31.038 [247/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:03:31.038 [248/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:03:31.038 [249/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:03:31.038 [250/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.038 [251/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:03:31.038 [252/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:03:31.038 [253/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:03:31.038 [254/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:03:31.038 [255/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:03:31.038 [256/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:03:31.038 [257/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:03:31.038 [258/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:03:31.038 [259/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:03:31.038 [260/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:03:31.038 [261/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:03:31.038 [262/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:03:31.038 [263/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:03:31.038 
[264/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:03:31.038 [265/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:03:31.038 [266/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:03:31.038 [267/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:03:31.038 [268/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:03:31.038 [269/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:31.038 [270/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:03:31.038 [271/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:31.038 [272/378] Linking static target drivers/librte_mempool_ring.a 00:03:31.038 [273/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:03:31.296 [274/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:03:31.296 [275/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.296 [276/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:03:31.296 [277/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.296 [278/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:03:31.296 [279/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:31.296 [280/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.296 [281/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:03:31.296 [282/378] Linking static target drivers/librte_compress_mlx5.a 00:03:31.296 [283/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:31.296 [284/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:03:31.296 [285/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:03:31.296 [286/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:03:31.296 [287/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:03:31.296 [288/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:31.296 [289/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.296 [290/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.296 [291/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:03:31.296 [292/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:03:31.296 [293/378] Linking static target drivers/librte_crypto_mlx5.a 00:03:31.296 [294/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:31.296 [295/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:31.296 [296/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:03:31.296 [297/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:31.296 
[298/378] Linking static target drivers/librte_compress_isal.a 00:03:31.555 [299/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:03:31.555 [300/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:03:31.555 [301/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:31.555 [302/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:31.555 [303/378] Linking static target drivers/librte_bus_pci.a 00:03:31.555 [304/378] Linking static target lib/librte_ethdev.a 00:03:31.555 [305/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.555 [306/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:03:31.555 [307/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:03:31.555 [308/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:31.555 [309/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:31.555 [310/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.555 [311/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:03:31.555 [312/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:31.555 [313/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:31.555 [314/378] Linking static target drivers/librte_common_mlx5.a 00:03:31.812 [315/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:03:31.812 [316/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:03:32.072 [317/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:03:32.072 [318/378] Linking static target drivers/libtmp_rte_common_qat.a 00:03:32.330 [319/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:03:32.330 [320/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:32.330 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:32.330 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:32.330 [323/378] Linking static target drivers/librte_common_qat.a 00:03:32.897 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:03:32.897 [325/378] Linking static target lib/librte_vhost.a 00:03:32.897 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:35.431 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.967 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:03:41.254 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:43.155 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:03:43.155 [331/378] Linking target lib/librte_eal.so.24.1 00:03:43.155 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:03:43.415 [333/378] Linking target lib/librte_ring.so.24.1 00:03:43.415 [334/378] Linking target lib/librte_meter.so.24.1 00:03:43.415 [335/378] Linking 
target lib/librte_dmadev.so.24.1 00:03:43.415 [336/378] Linking target lib/librte_timer.so.24.1 00:03:43.415 [337/378] Linking target lib/librte_pci.so.24.1 00:03:43.415 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:03:43.415 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:03:43.415 [340/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:03:43.415 [341/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:03:43.415 [342/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:03:43.415 [343/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:03:43.415 [344/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:03:43.415 [345/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:03:43.415 [346/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:03:43.415 [347/378] Linking target lib/librte_mempool.so.24.1 00:03:43.415 [348/378] Linking target lib/librte_rcu.so.24.1 00:03:43.415 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:03:43.706 [350/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:03:43.706 [351/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:03:43.706 [352/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:03:43.706 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:03:43.706 [354/378] Linking target lib/librte_mbuf.so.24.1 00:03:43.968 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:03:43.968 [356/378] Linking target lib/librte_reorder.so.24.1 00:03:43.968 [357/378] Linking target lib/librte_compressdev.so.24.1 00:03:43.968 [358/378] Linking target lib/librte_net.so.24.1 00:03:43.968 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:03:43.968 [360/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:03:43.968 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:03:43.968 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:03:44.227 [363/378] Linking target lib/librte_hash.so.24.1 00:03:44.227 [364/378] Linking target lib/librte_cmdline.so.24.1 00:03:44.227 [365/378] Linking target lib/librte_security.so.24.1 00:03:44.227 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:03:44.227 [367/378] Linking target lib/librte_ethdev.so.24.1 00:03:44.227 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:03:44.227 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:03:44.227 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:03:44.486 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:03:44.486 [372/378] Linking target lib/librte_power.so.24.1 00:03:44.486 [373/378] Linking target lib/librte_vhost.so.24.1 00:03:44.486 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:03:44.486 [375/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:03:44.486 [376/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:03:44.486 [377/378] Linking target 
drivers/librte_crypto_ipsec_mb.so.24.1 00:03:44.747 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:03:44.747 INFO: autodetecting backend as ninja 00:03:44.747 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 112 00:03:46.126 CC lib/log/log.o 00:03:46.126 CC lib/log/log_flags.o 00:03:46.126 CC lib/log/log_deprecated.o 00:03:46.126 CC lib/ut/ut.o 00:03:46.126 CC lib/ut_mock/mock.o 00:03:46.126 LIB libspdk_ut.a 00:03:46.126 LIB libspdk_log.a 00:03:46.126 LIB libspdk_ut_mock.a 00:03:46.126 SO libspdk_ut.so.2.0 00:03:46.126 SO libspdk_log.so.7.0 00:03:46.126 SO libspdk_ut_mock.so.6.0 00:03:46.126 SYMLINK libspdk_ut.so 00:03:46.126 SYMLINK libspdk_log.so 00:03:46.126 SYMLINK libspdk_ut_mock.so 00:03:46.695 CXX lib/trace_parser/trace.o 00:03:46.695 CC lib/dma/dma.o 00:03:46.695 CC lib/util/base64.o 00:03:46.695 CC lib/util/bit_array.o 00:03:46.695 CC lib/util/cpuset.o 00:03:46.695 CC lib/util/crc16.o 00:03:46.695 CC lib/util/crc32.o 00:03:46.695 CC lib/ioat/ioat.o 00:03:46.695 CC lib/util/crc64.o 00:03:46.695 CC lib/util/crc32c.o 00:03:46.695 CC lib/util/crc32_ieee.o 00:03:46.695 CC lib/util/dif.o 00:03:46.695 CC lib/util/fd.o 00:03:46.695 CC lib/util/fd_group.o 00:03:46.695 CC lib/util/file.o 00:03:46.695 CC lib/util/hexlify.o 00:03:46.695 CC lib/util/iov.o 00:03:46.695 CC lib/util/math.o 00:03:46.695 CC lib/util/net.o 00:03:46.695 CC lib/util/pipe.o 00:03:46.695 CC lib/util/strerror_tls.o 00:03:46.695 CC lib/util/string.o 00:03:46.695 CC lib/util/uuid.o 00:03:46.695 CC lib/util/zipf.o 00:03:46.695 CC lib/util/xor.o 00:03:46.695 CC lib/vfio_user/host/vfio_user.o 00:03:46.695 CC lib/vfio_user/host/vfio_user_pci.o 00:03:46.695 LIB libspdk_dma.a 00:03:46.695 SO libspdk_dma.so.4.0 00:03:46.954 LIB libspdk_ioat.a 00:03:46.954 SYMLINK libspdk_dma.so 00:03:46.954 SO libspdk_ioat.so.7.0 00:03:46.954 SYMLINK libspdk_ioat.so 00:03:46.954 LIB libspdk_vfio_user.a 00:03:46.954 SO libspdk_vfio_user.so.5.0 00:03:47.213 LIB libspdk_util.a 00:03:47.213 SYMLINK libspdk_vfio_user.so 00:03:47.213 SO libspdk_util.so.10.0 00:03:47.472 SYMLINK libspdk_util.so 00:03:47.472 LIB libspdk_trace_parser.a 00:03:47.472 SO libspdk_trace_parser.so.5.0 00:03:47.472 SYMLINK libspdk_trace_parser.so 00:03:47.732 CC lib/env_dpdk/env.o 00:03:47.732 CC lib/env_dpdk/memory.o 00:03:47.732 CC lib/conf/conf.o 00:03:47.732 CC lib/env_dpdk/pci.o 00:03:47.732 CC lib/json/json_parse.o 00:03:47.732 CC lib/env_dpdk/threads.o 00:03:47.732 CC lib/env_dpdk/init.o 00:03:47.732 CC lib/json/json_util.o 00:03:47.732 CC lib/json/json_write.o 00:03:47.732 CC lib/env_dpdk/pci_ioat.o 00:03:47.732 CC lib/env_dpdk/pci_virtio.o 00:03:47.732 CC lib/env_dpdk/pci_vmd.o 00:03:47.732 CC lib/reduce/reduce.o 00:03:47.732 CC lib/env_dpdk/pci_event.o 00:03:47.732 CC lib/env_dpdk/pci_idxd.o 00:03:47.732 CC lib/env_dpdk/sigbus_handler.o 00:03:47.732 CC lib/env_dpdk/pci_dpdk.o 00:03:47.732 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:47.732 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:47.732 CC lib/rdma_utils/rdma_utils.o 00:03:47.732 CC lib/vmd/vmd.o 00:03:47.732 CC lib/vmd/led.o 00:03:47.732 CC lib/idxd/idxd.o 00:03:47.732 CC lib/rdma_provider/common.o 00:03:47.732 CC lib/idxd/idxd_user.o 00:03:47.732 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:47.732 CC lib/idxd/idxd_kernel.o 00:03:47.991 LIB libspdk_rdma_provider.a 00:03:47.991 LIB libspdk_conf.a 00:03:47.991 SO libspdk_rdma_provider.so.6.0 00:03:47.991 SO libspdk_conf.so.6.0 00:03:47.991 LIB libspdk_json.a 00:03:47.991 SYMLINK 
libspdk_rdma_provider.so 00:03:47.991 SYMLINK libspdk_conf.so 00:03:48.251 SO libspdk_json.so.6.0 00:03:48.251 SYMLINK libspdk_json.so 00:03:48.251 LIB libspdk_idxd.a 00:03:48.251 LIB libspdk_rdma_utils.a 00:03:48.251 SO libspdk_idxd.so.12.0 00:03:48.251 SO libspdk_rdma_utils.so.1.0 00:03:48.509 LIB libspdk_vmd.a 00:03:48.509 LIB libspdk_reduce.a 00:03:48.509 SYMLINK libspdk_idxd.so 00:03:48.509 SO libspdk_vmd.so.6.0 00:03:48.509 SO libspdk_reduce.so.6.1 00:03:48.509 SYMLINK libspdk_rdma_utils.so 00:03:48.509 SYMLINK libspdk_vmd.so 00:03:48.509 SYMLINK libspdk_reduce.so 00:03:48.509 CC lib/jsonrpc/jsonrpc_server.o 00:03:48.509 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:48.510 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:48.510 CC lib/jsonrpc/jsonrpc_client.o 00:03:48.769 LIB libspdk_jsonrpc.a 00:03:48.769 SO libspdk_jsonrpc.so.6.0 00:03:49.028 SYMLINK libspdk_jsonrpc.so 00:03:49.028 LIB libspdk_env_dpdk.a 00:03:49.028 SO libspdk_env_dpdk.so.15.0 00:03:49.287 SYMLINK libspdk_env_dpdk.so 00:03:49.287 CC lib/rpc/rpc.o 00:03:49.547 LIB libspdk_rpc.a 00:03:49.547 SO libspdk_rpc.so.6.0 00:03:49.547 SYMLINK libspdk_rpc.so 00:03:50.115 CC lib/notify/notify.o 00:03:50.115 CC lib/notify/notify_rpc.o 00:03:50.115 CC lib/keyring/keyring.o 00:03:50.115 CC lib/keyring/keyring_rpc.o 00:03:50.115 CC lib/trace/trace_flags.o 00:03:50.115 CC lib/trace/trace.o 00:03:50.115 CC lib/trace/trace_rpc.o 00:03:50.115 LIB libspdk_notify.a 00:03:50.115 SO libspdk_notify.so.6.0 00:03:50.373 LIB libspdk_keyring.a 00:03:50.373 LIB libspdk_trace.a 00:03:50.373 SYMLINK libspdk_notify.so 00:03:50.373 SO libspdk_keyring.so.1.0 00:03:50.373 SO libspdk_trace.so.10.0 00:03:50.373 SYMLINK libspdk_keyring.so 00:03:50.373 SYMLINK libspdk_trace.so 00:03:50.941 CC lib/sock/sock.o 00:03:50.941 CC lib/sock/sock_rpc.o 00:03:50.941 CC lib/thread/thread.o 00:03:50.941 CC lib/thread/iobuf.o 00:03:51.200 LIB libspdk_sock.a 00:03:51.200 SO libspdk_sock.so.10.0 00:03:51.200 SYMLINK libspdk_sock.so 00:03:51.766 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:51.766 CC lib/nvme/nvme_ctrlr.o 00:03:51.766 CC lib/nvme/nvme_fabric.o 00:03:51.766 CC lib/nvme/nvme_ns_cmd.o 00:03:51.766 CC lib/nvme/nvme_pcie.o 00:03:51.766 CC lib/nvme/nvme_ns.o 00:03:51.766 CC lib/nvme/nvme_pcie_common.o 00:03:51.766 CC lib/nvme/nvme_qpair.o 00:03:51.766 CC lib/nvme/nvme.o 00:03:51.766 CC lib/nvme/nvme_quirks.o 00:03:51.766 CC lib/nvme/nvme_transport.o 00:03:51.766 CC lib/nvme/nvme_discovery.o 00:03:51.766 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:51.766 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:51.766 CC lib/nvme/nvme_tcp.o 00:03:51.766 CC lib/nvme/nvme_opal.o 00:03:51.766 CC lib/nvme/nvme_io_msg.o 00:03:51.766 CC lib/nvme/nvme_stubs.o 00:03:51.766 CC lib/nvme/nvme_poll_group.o 00:03:51.766 CC lib/nvme/nvme_zns.o 00:03:51.766 CC lib/nvme/nvme_auth.o 00:03:51.766 CC lib/nvme/nvme_cuse.o 00:03:51.766 CC lib/nvme/nvme_rdma.o 00:03:52.332 LIB libspdk_thread.a 00:03:52.332 SO libspdk_thread.so.10.1 00:03:52.332 SYMLINK libspdk_thread.so 00:03:52.590 CC lib/accel/accel.o 00:03:52.590 CC lib/accel/accel_rpc.o 00:03:52.590 CC lib/accel/accel_sw.o 00:03:52.590 CC lib/blob/blobstore.o 00:03:52.590 CC lib/blob/request.o 00:03:52.590 CC lib/blob/zeroes.o 00:03:52.590 CC lib/blob/blob_bs_dev.o 00:03:52.590 CC lib/init/subsystem_rpc.o 00:03:52.590 CC lib/init/json_config.o 00:03:52.590 CC lib/init/subsystem.o 00:03:52.590 CC lib/init/rpc.o 00:03:52.590 CC lib/virtio/virtio.o 00:03:52.590 CC lib/virtio/virtio_vhost_user.o 00:03:52.590 CC lib/virtio/virtio_vfio_user.o 00:03:52.590 CC 
lib/virtio/virtio_pci.o 00:03:52.848 LIB libspdk_init.a 00:03:53.107 SO libspdk_init.so.5.0 00:03:53.107 LIB libspdk_virtio.a 00:03:53.107 SYMLINK libspdk_init.so 00:03:53.107 SO libspdk_virtio.so.7.0 00:03:53.107 SYMLINK libspdk_virtio.so 00:03:53.366 CC lib/event/app.o 00:03:53.366 CC lib/event/log_rpc.o 00:03:53.366 CC lib/event/reactor.o 00:03:53.366 CC lib/event/app_rpc.o 00:03:53.366 CC lib/event/scheduler_static.o 00:03:53.625 LIB libspdk_accel.a 00:03:53.625 SO libspdk_accel.so.16.0 00:03:53.625 LIB libspdk_nvme.a 00:03:53.625 SYMLINK libspdk_accel.so 00:03:53.884 SO libspdk_nvme.so.13.1 00:03:53.884 LIB libspdk_event.a 00:03:53.884 SO libspdk_event.so.14.0 00:03:54.142 SYMLINK libspdk_event.so 00:03:54.142 CC lib/bdev/bdev_rpc.o 00:03:54.142 CC lib/bdev/bdev.o 00:03:54.142 CC lib/bdev/bdev_zone.o 00:03:54.142 CC lib/bdev/part.o 00:03:54.142 CC lib/bdev/scsi_nvme.o 00:03:54.142 SYMLINK libspdk_nvme.so 00:03:55.520 LIB libspdk_blob.a 00:03:55.520 SO libspdk_blob.so.11.0 00:03:55.520 SYMLINK libspdk_blob.so 00:03:56.089 CC lib/lvol/lvol.o 00:03:56.089 CC lib/blobfs/blobfs.o 00:03:56.089 CC lib/blobfs/tree.o 00:03:56.690 LIB libspdk_bdev.a 00:03:56.690 SO libspdk_bdev.so.16.0 00:03:56.690 LIB libspdk_blobfs.a 00:03:56.953 SYMLINK libspdk_bdev.so 00:03:56.953 SO libspdk_blobfs.so.10.0 00:03:56.953 LIB libspdk_lvol.a 00:03:56.953 SO libspdk_lvol.so.10.0 00:03:56.953 SYMLINK libspdk_blobfs.so 00:03:56.953 SYMLINK libspdk_lvol.so 00:03:57.215 CC lib/ftl/ftl_core.o 00:03:57.215 CC lib/ftl/ftl_init.o 00:03:57.215 CC lib/ftl/ftl_layout.o 00:03:57.215 CC lib/ftl/ftl_debug.o 00:03:57.215 CC lib/ftl/ftl_io.o 00:03:57.215 CC lib/ftl/ftl_sb.o 00:03:57.215 CC lib/scsi/dev.o 00:03:57.215 CC lib/ftl/ftl_l2p.o 00:03:57.215 CC lib/scsi/lun.o 00:03:57.215 CC lib/nvmf/ctrlr.o 00:03:57.215 CC lib/nvmf/ctrlr_bdev.o 00:03:57.215 CC lib/ftl/ftl_band_ops.o 00:03:57.215 CC lib/ublk/ublk_rpc.o 00:03:57.215 CC lib/scsi/port.o 00:03:57.215 CC lib/ftl/ftl_nv_cache.o 00:03:57.215 CC lib/nbd/nbd.o 00:03:57.215 CC lib/nvmf/ctrlr_discovery.o 00:03:57.215 CC lib/ublk/ublk.o 00:03:57.215 CC lib/ftl/ftl_l2p_flat.o 00:03:57.215 CC lib/ftl/ftl_band.o 00:03:57.215 CC lib/scsi/scsi.o 00:03:57.215 CC lib/nbd/nbd_rpc.o 00:03:57.215 CC lib/scsi/scsi_bdev.o 00:03:57.215 CC lib/scsi/scsi_rpc.o 00:03:57.215 CC lib/nvmf/subsystem.o 00:03:57.215 CC lib/ftl/ftl_writer.o 00:03:57.215 CC lib/scsi/scsi_pr.o 00:03:57.215 CC lib/nvmf/nvmf.o 00:03:57.215 CC lib/ftl/ftl_rq.o 00:03:57.215 CC lib/ftl/ftl_reloc.o 00:03:57.215 CC lib/nvmf/nvmf_rpc.o 00:03:57.215 CC lib/scsi/task.o 00:03:57.215 CC lib/nvmf/transport.o 00:03:57.215 CC lib/ftl/ftl_l2p_cache.o 00:03:57.215 CC lib/nvmf/tcp.o 00:03:57.215 CC lib/ftl/ftl_p2l.o 00:03:57.215 CC lib/nvmf/stubs.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt.o 00:03:57.215 CC lib/nvmf/mdns_server.o 00:03:57.215 CC lib/nvmf/rdma.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:57.215 CC lib/nvmf/auth.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:57.215 CC lib/ftl/utils/ftl_conf.o 00:03:57.215 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:57.215 CC lib/ftl/utils/ftl_mempool.o 00:03:57.215 CC 
lib/ftl/utils/ftl_md.o 00:03:57.215 CC lib/ftl/utils/ftl_bitmap.o 00:03:57.215 CC lib/ftl/utils/ftl_property.o 00:03:57.215 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:57.215 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:57.215 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:57.215 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:57.215 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:57.215 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:57.215 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:57.215 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:57.215 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:57.215 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:57.215 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:57.215 CC lib/ftl/base/ftl_base_dev.o 00:03:57.215 CC lib/ftl/base/ftl_base_bdev.o 00:03:57.215 CC lib/ftl/ftl_trace.o 00:03:57.782 LIB libspdk_nbd.a 00:03:57.782 SO libspdk_nbd.so.7.0 00:03:57.782 LIB libspdk_scsi.a 00:03:57.782 SYMLINK libspdk_nbd.so 00:03:57.782 SO libspdk_scsi.so.9.0 00:03:58.041 SYMLINK libspdk_scsi.so 00:03:58.041 LIB libspdk_ublk.a 00:03:58.041 SO libspdk_ublk.so.3.0 00:03:58.041 SYMLINK libspdk_ublk.so 00:03:58.300 CC lib/vhost/vhost.o 00:03:58.300 CC lib/iscsi/conn.o 00:03:58.300 LIB libspdk_ftl.a 00:03:58.300 CC lib/iscsi/init_grp.o 00:03:58.300 CC lib/iscsi/md5.o 00:03:58.300 CC lib/vhost/vhost_scsi.o 00:03:58.300 CC lib/iscsi/iscsi.o 00:03:58.300 CC lib/vhost/vhost_rpc.o 00:03:58.300 CC lib/iscsi/param.o 00:03:58.300 CC lib/vhost/vhost_blk.o 00:03:58.300 CC lib/iscsi/portal_grp.o 00:03:58.300 CC lib/vhost/rte_vhost_user.o 00:03:58.300 CC lib/iscsi/iscsi_rpc.o 00:03:58.300 CC lib/iscsi/tgt_node.o 00:03:58.300 CC lib/iscsi/iscsi_subsystem.o 00:03:58.300 CC lib/iscsi/task.o 00:03:58.558 SO libspdk_ftl.so.9.0 00:03:58.816 SYMLINK libspdk_ftl.so 00:03:59.383 LIB libspdk_nvmf.a 00:03:59.383 LIB libspdk_vhost.a 00:03:59.383 SO libspdk_vhost.so.8.0 00:03:59.383 SO libspdk_nvmf.so.19.0 00:03:59.383 SYMLINK libspdk_vhost.so 00:03:59.642 SYMLINK libspdk_nvmf.so 00:03:59.642 LIB libspdk_iscsi.a 00:03:59.642 SO libspdk_iscsi.so.8.0 00:03:59.901 SYMLINK libspdk_iscsi.so 00:04:00.471 CC module/env_dpdk/env_dpdk_rpc.o 00:04:00.471 CC module/sock/posix/posix.o 00:04:00.471 CC module/keyring/linux/keyring.o 00:04:00.471 CC module/keyring/linux/keyring_rpc.o 00:04:00.471 CC module/accel/dsa/accel_dsa.o 00:04:00.471 CC module/accel/dsa/accel_dsa_rpc.o 00:04:00.471 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:00.471 LIB libspdk_env_dpdk_rpc.a 00:04:00.471 CC module/accel/error/accel_error.o 00:04:00.471 CC module/accel/error/accel_error_rpc.o 00:04:00.471 CC module/blob/bdev/blob_bdev.o 00:04:00.471 CC module/accel/iaa/accel_iaa.o 00:04:00.471 CC module/accel/iaa/accel_iaa_rpc.o 00:04:00.471 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:04:00.471 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:04:00.471 CC module/scheduler/gscheduler/gscheduler.o 00:04:00.471 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:00.471 CC module/keyring/file/keyring.o 00:04:00.471 CC module/accel/ioat/accel_ioat.o 00:04:00.471 CC module/keyring/file/keyring_rpc.o 00:04:00.471 CC module/accel/ioat/accel_ioat_rpc.o 00:04:00.729 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:04:00.729 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:04:00.729 SO libspdk_env_dpdk_rpc.so.6.0 00:04:00.729 SYMLINK libspdk_env_dpdk_rpc.so 00:04:00.729 LIB libspdk_keyring_linux.a 00:04:00.729 LIB libspdk_accel_error.a 00:04:00.729 LIB libspdk_scheduler_dpdk_governor.a 00:04:00.729 LIB libspdk_scheduler_gscheduler.a 
00:04:00.729 LIB libspdk_keyring_file.a 00:04:00.729 LIB libspdk_accel_dsa.a 00:04:00.729 SO libspdk_keyring_linux.so.1.0 00:04:00.729 SO libspdk_accel_error.so.2.0 00:04:00.729 SO libspdk_scheduler_dpdk_governor.so.4.0 00:04:00.729 SO libspdk_scheduler_gscheduler.so.4.0 00:04:00.729 LIB libspdk_scheduler_dynamic.a 00:04:00.729 LIB libspdk_accel_iaa.a 00:04:00.729 SO libspdk_keyring_file.so.1.0 00:04:00.729 LIB libspdk_accel_ioat.a 00:04:00.729 SO libspdk_accel_dsa.so.5.0 00:04:00.990 SO libspdk_scheduler_dynamic.so.4.0 00:04:00.990 SO libspdk_accel_iaa.so.3.0 00:04:00.990 SYMLINK libspdk_keyring_linux.so 00:04:00.990 SYMLINK libspdk_accel_error.so 00:04:00.990 SO libspdk_accel_ioat.so.6.0 00:04:00.990 SYMLINK libspdk_scheduler_gscheduler.so 00:04:00.990 SYMLINK libspdk_scheduler_dpdk_governor.so 00:04:00.990 LIB libspdk_blob_bdev.a 00:04:00.990 SYMLINK libspdk_keyring_file.so 00:04:00.990 SYMLINK libspdk_accel_iaa.so 00:04:00.990 SYMLINK libspdk_scheduler_dynamic.so 00:04:00.990 SO libspdk_blob_bdev.so.11.0 00:04:00.990 SYMLINK libspdk_accel_ioat.so 00:04:00.990 SYMLINK libspdk_accel_dsa.so 00:04:00.990 SYMLINK libspdk_blob_bdev.so 00:04:01.246 LIB libspdk_sock_posix.a 00:04:01.246 SO libspdk_sock_posix.so.6.0 00:04:01.505 SYMLINK libspdk_sock_posix.so 00:04:01.505 CC module/bdev/error/vbdev_error_rpc.o 00:04:01.505 CC module/bdev/error/vbdev_error.o 00:04:01.505 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:01.505 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:01.505 CC module/bdev/crypto/vbdev_crypto.o 00:04:01.505 CC module/bdev/nvme/bdev_nvme.o 00:04:01.505 CC module/bdev/lvol/vbdev_lvol.o 00:04:01.505 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:01.505 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:04:01.505 CC module/bdev/passthru/vbdev_passthru.o 00:04:01.505 CC module/bdev/nvme/nvme_rpc.o 00:04:01.505 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:01.505 CC module/bdev/gpt/gpt.o 00:04:01.505 CC module/bdev/malloc/bdev_malloc.o 00:04:01.505 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:01.505 CC module/bdev/nvme/bdev_mdns_client.o 00:04:01.505 CC module/bdev/nvme/vbdev_opal.o 00:04:01.505 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:01.505 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:01.505 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:01.505 CC module/bdev/gpt/vbdev_gpt.o 00:04:01.505 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:01.505 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:01.505 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:01.505 CC module/bdev/delay/vbdev_delay.o 00:04:01.505 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:01.505 CC module/bdev/raid/bdev_raid.o 00:04:01.505 CC module/bdev/raid/bdev_raid_rpc.o 00:04:01.505 CC module/bdev/ftl/bdev_ftl.o 00:04:01.505 CC module/bdev/raid/bdev_raid_sb.o 00:04:01.505 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:01.505 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:01.505 CC module/bdev/raid/raid0.o 00:04:01.505 CC module/bdev/iscsi/bdev_iscsi.o 00:04:01.505 CC module/bdev/raid/raid1.o 00:04:01.505 CC module/bdev/raid/concat.o 00:04:01.505 CC module/blobfs/bdev/blobfs_bdev.o 00:04:01.505 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:01.505 CC module/bdev/split/vbdev_split.o 00:04:01.505 CC module/bdev/null/bdev_null.o 00:04:01.505 CC module/bdev/split/vbdev_split_rpc.o 00:04:01.505 CC module/bdev/compress/vbdev_compress.o 00:04:01.505 CC module/bdev/compress/vbdev_compress_rpc.o 00:04:01.505 CC module/bdev/null/bdev_null_rpc.o 00:04:01.505 CC module/bdev/aio/bdev_aio.o 00:04:01.505 CC module/bdev/aio/bdev_aio_rpc.o 00:04:01.505 
LIB libspdk_accel_dpdk_compressdev.a 00:04:01.765 SO libspdk_accel_dpdk_compressdev.so.3.0 00:04:01.765 SYMLINK libspdk_accel_dpdk_compressdev.so 00:04:01.765 LIB libspdk_blobfs_bdev.a 00:04:01.765 LIB libspdk_bdev_split.a 00:04:01.765 LIB libspdk_bdev_gpt.a 00:04:01.765 LIB libspdk_bdev_error.a 00:04:01.765 SO libspdk_blobfs_bdev.so.6.0 00:04:02.024 SO libspdk_bdev_gpt.so.6.0 00:04:02.024 LIB libspdk_bdev_null.a 00:04:02.024 SO libspdk_bdev_split.so.6.0 00:04:02.024 LIB libspdk_accel_dpdk_cryptodev.a 00:04:02.024 SO libspdk_bdev_error.so.6.0 00:04:02.024 LIB libspdk_bdev_passthru.a 00:04:02.024 LIB libspdk_bdev_ftl.a 00:04:02.024 LIB libspdk_bdev_zone_block.a 00:04:02.024 SYMLINK libspdk_blobfs_bdev.so 00:04:02.024 LIB libspdk_bdev_iscsi.a 00:04:02.024 SO libspdk_bdev_null.so.6.0 00:04:02.024 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:04:02.024 LIB libspdk_bdev_crypto.a 00:04:02.024 SO libspdk_bdev_passthru.so.6.0 00:04:02.024 LIB libspdk_bdev_delay.a 00:04:02.024 SYMLINK libspdk_bdev_gpt.so 00:04:02.024 SO libspdk_bdev_zone_block.so.6.0 00:04:02.024 SO libspdk_bdev_iscsi.so.6.0 00:04:02.024 SYMLINK libspdk_bdev_error.so 00:04:02.024 SYMLINK libspdk_bdev_split.so 00:04:02.024 SO libspdk_bdev_ftl.so.6.0 00:04:02.024 LIB libspdk_bdev_aio.a 00:04:02.024 LIB libspdk_bdev_malloc.a 00:04:02.024 SO libspdk_bdev_crypto.so.6.0 00:04:02.024 SO libspdk_bdev_delay.so.6.0 00:04:02.024 LIB libspdk_bdev_compress.a 00:04:02.024 SYMLINK libspdk_bdev_null.so 00:04:02.024 SO libspdk_bdev_aio.so.6.0 00:04:02.024 SO libspdk_bdev_malloc.so.6.0 00:04:02.024 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:04:02.024 SYMLINK libspdk_bdev_passthru.so 00:04:02.024 SYMLINK libspdk_bdev_iscsi.so 00:04:02.024 SYMLINK libspdk_bdev_zone_block.so 00:04:02.024 SYMLINK libspdk_bdev_ftl.so 00:04:02.024 SO libspdk_bdev_compress.so.6.0 00:04:02.024 SYMLINK libspdk_bdev_crypto.so 00:04:02.024 SYMLINK libspdk_bdev_delay.so 00:04:02.024 SYMLINK libspdk_bdev_aio.so 00:04:02.024 LIB libspdk_bdev_virtio.a 00:04:02.024 LIB libspdk_bdev_lvol.a 00:04:02.024 SYMLINK libspdk_bdev_malloc.so 00:04:02.024 SYMLINK libspdk_bdev_compress.so 00:04:02.024 SO libspdk_bdev_lvol.so.6.0 00:04:02.024 SO libspdk_bdev_virtio.so.6.0 00:04:02.284 SYMLINK libspdk_bdev_lvol.so 00:04:02.284 SYMLINK libspdk_bdev_virtio.so 00:04:02.543 LIB libspdk_bdev_raid.a 00:04:02.543 SO libspdk_bdev_raid.so.6.0 00:04:02.802 SYMLINK libspdk_bdev_raid.so 00:04:03.739 LIB libspdk_bdev_nvme.a 00:04:03.739 SO libspdk_bdev_nvme.so.7.0 00:04:03.739 SYMLINK libspdk_bdev_nvme.so 00:04:04.675 CC module/event/subsystems/keyring/keyring.o 00:04:04.675 CC module/event/subsystems/sock/sock.o 00:04:04.675 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:04.675 CC module/event/subsystems/iobuf/iobuf.o 00:04:04.675 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:04.675 CC module/event/subsystems/vmd/vmd.o 00:04:04.675 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:04.675 CC module/event/subsystems/scheduler/scheduler.o 00:04:04.675 LIB libspdk_event_keyring.a 00:04:04.675 LIB libspdk_event_sock.a 00:04:04.675 SO libspdk_event_keyring.so.1.0 00:04:04.675 LIB libspdk_event_vhost_blk.a 00:04:04.675 LIB libspdk_event_scheduler.a 00:04:04.675 SO libspdk_event_sock.so.5.0 00:04:04.675 LIB libspdk_event_vmd.a 00:04:04.675 LIB libspdk_event_iobuf.a 00:04:04.675 SO libspdk_event_vhost_blk.so.3.0 00:04:04.675 SO libspdk_event_scheduler.so.4.0 00:04:04.675 SYMLINK libspdk_event_keyring.so 00:04:04.675 SO libspdk_event_vmd.so.6.0 00:04:04.675 SO libspdk_event_iobuf.so.3.0 00:04:04.934 SYMLINK 
libspdk_event_sock.so 00:04:04.934 SYMLINK libspdk_event_vhost_blk.so 00:04:04.934 SYMLINK libspdk_event_vmd.so 00:04:04.934 SYMLINK libspdk_event_iobuf.so 00:04:04.934 SYMLINK libspdk_event_scheduler.so 00:04:05.193 CC module/event/subsystems/accel/accel.o 00:04:05.453 LIB libspdk_event_accel.a 00:04:05.453 SO libspdk_event_accel.so.6.0 00:04:05.453 SYMLINK libspdk_event_accel.so 00:04:05.712 CC module/event/subsystems/bdev/bdev.o 00:04:05.971 LIB libspdk_event_bdev.a 00:04:05.971 SO libspdk_event_bdev.so.6.0 00:04:05.971 SYMLINK libspdk_event_bdev.so 00:04:06.540 CC module/event/subsystems/scsi/scsi.o 00:04:06.540 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:06.540 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:06.540 CC module/event/subsystems/nbd/nbd.o 00:04:06.540 CC module/event/subsystems/ublk/ublk.o 00:04:06.540 LIB libspdk_event_scsi.a 00:04:06.540 LIB libspdk_event_nbd.a 00:04:06.540 LIB libspdk_event_ublk.a 00:04:06.540 SO libspdk_event_scsi.so.6.0 00:04:06.540 SO libspdk_event_nbd.so.6.0 00:04:06.800 SO libspdk_event_ublk.so.3.0 00:04:06.800 LIB libspdk_event_nvmf.a 00:04:06.800 SYMLINK libspdk_event_scsi.so 00:04:06.800 SYMLINK libspdk_event_nbd.so 00:04:06.800 SYMLINK libspdk_event_ublk.so 00:04:06.800 SO libspdk_event_nvmf.so.6.0 00:04:06.800 SYMLINK libspdk_event_nvmf.so 00:04:07.059 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:07.059 CC module/event/subsystems/iscsi/iscsi.o 00:04:07.318 LIB libspdk_event_iscsi.a 00:04:07.318 LIB libspdk_event_vhost_scsi.a 00:04:07.318 SO libspdk_event_iscsi.so.6.0 00:04:07.318 SO libspdk_event_vhost_scsi.so.3.0 00:04:07.318 SYMLINK libspdk_event_iscsi.so 00:04:07.318 SYMLINK libspdk_event_vhost_scsi.so 00:04:07.578 SO libspdk.so.6.0 00:04:07.578 SYMLINK libspdk.so 00:04:07.837 CC app/trace_record/trace_record.o 00:04:07.837 CC app/spdk_nvme_identify/identify.o 00:04:07.837 CC test/rpc_client/rpc_client_test.o 00:04:07.837 CC app/spdk_nvme_perf/perf.o 00:04:07.837 CXX app/trace/trace.o 00:04:07.837 CC app/spdk_nvme_discover/discovery_aer.o 00:04:07.837 CC app/spdk_lspci/spdk_lspci.o 00:04:07.837 TEST_HEADER include/spdk/accel_module.h 00:04:07.837 TEST_HEADER include/spdk/accel.h 00:04:07.837 TEST_HEADER include/spdk/assert.h 00:04:07.837 TEST_HEADER include/spdk/barrier.h 00:04:07.837 TEST_HEADER include/spdk/bdev.h 00:04:07.837 TEST_HEADER include/spdk/base64.h 00:04:07.837 TEST_HEADER include/spdk/bdev_module.h 00:04:07.837 TEST_HEADER include/spdk/bit_pool.h 00:04:07.837 TEST_HEADER include/spdk/bit_array.h 00:04:07.837 TEST_HEADER include/spdk/bdev_zone.h 00:04:07.837 TEST_HEADER include/spdk/blob_bdev.h 00:04:07.837 CC app/spdk_top/spdk_top.o 00:04:07.837 TEST_HEADER include/spdk/blob.h 00:04:07.837 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:07.837 TEST_HEADER include/spdk/blobfs.h 00:04:07.837 TEST_HEADER include/spdk/conf.h 00:04:07.837 TEST_HEADER include/spdk/cpuset.h 00:04:07.837 TEST_HEADER include/spdk/config.h 00:04:07.837 TEST_HEADER include/spdk/crc64.h 00:04:07.837 TEST_HEADER include/spdk/crc16.h 00:04:07.837 TEST_HEADER include/spdk/crc32.h 00:04:07.837 TEST_HEADER include/spdk/dma.h 00:04:07.837 TEST_HEADER include/spdk/dif.h 00:04:07.837 TEST_HEADER include/spdk/env_dpdk.h 00:04:07.837 TEST_HEADER include/spdk/endian.h 00:04:07.837 TEST_HEADER include/spdk/env.h 00:04:07.837 TEST_HEADER include/spdk/fd_group.h 00:04:07.837 TEST_HEADER include/spdk/event.h 00:04:07.837 TEST_HEADER include/spdk/fd.h 00:04:07.837 TEST_HEADER include/spdk/file.h 00:04:07.837 TEST_HEADER include/spdk/ftl.h 00:04:08.112 
TEST_HEADER include/spdk/gpt_spec.h 00:04:08.112 TEST_HEADER include/spdk/idxd.h 00:04:08.112 TEST_HEADER include/spdk/histogram_data.h 00:04:08.112 TEST_HEADER include/spdk/hexlify.h 00:04:08.112 TEST_HEADER include/spdk/init.h 00:04:08.112 TEST_HEADER include/spdk/ioat.h 00:04:08.112 TEST_HEADER include/spdk/idxd_spec.h 00:04:08.112 TEST_HEADER include/spdk/ioat_spec.h 00:04:08.112 TEST_HEADER include/spdk/iscsi_spec.h 00:04:08.112 CC app/iscsi_tgt/iscsi_tgt.o 00:04:08.112 TEST_HEADER include/spdk/json.h 00:04:08.112 TEST_HEADER include/spdk/jsonrpc.h 00:04:08.112 TEST_HEADER include/spdk/keyring.h 00:04:08.112 TEST_HEADER include/spdk/likely.h 00:04:08.112 CC app/nvmf_tgt/nvmf_main.o 00:04:08.112 TEST_HEADER include/spdk/keyring_module.h 00:04:08.112 TEST_HEADER include/spdk/lvol.h 00:04:08.112 TEST_HEADER include/spdk/log.h 00:04:08.112 TEST_HEADER include/spdk/memory.h 00:04:08.112 CC app/spdk_dd/spdk_dd.o 00:04:08.112 TEST_HEADER include/spdk/mmio.h 00:04:08.112 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:08.112 TEST_HEADER include/spdk/net.h 00:04:08.112 TEST_HEADER include/spdk/nbd.h 00:04:08.112 TEST_HEADER include/spdk/notify.h 00:04:08.112 TEST_HEADER include/spdk/nvme_intel.h 00:04:08.112 TEST_HEADER include/spdk/nvme.h 00:04:08.112 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:08.112 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:08.112 TEST_HEADER include/spdk/nvme_zns.h 00:04:08.112 TEST_HEADER include/spdk/nvme_spec.h 00:04:08.112 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:08.112 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:08.112 TEST_HEADER include/spdk/nvmf_spec.h 00:04:08.112 TEST_HEADER include/spdk/nvmf.h 00:04:08.112 TEST_HEADER include/spdk/nvmf_transport.h 00:04:08.112 TEST_HEADER include/spdk/opal.h 00:04:08.112 TEST_HEADER include/spdk/pci_ids.h 00:04:08.112 TEST_HEADER include/spdk/opal_spec.h 00:04:08.112 TEST_HEADER include/spdk/queue.h 00:04:08.112 TEST_HEADER include/spdk/rpc.h 00:04:08.112 TEST_HEADER include/spdk/pipe.h 00:04:08.112 TEST_HEADER include/spdk/scsi.h 00:04:08.112 TEST_HEADER include/spdk/reduce.h 00:04:08.112 TEST_HEADER include/spdk/scheduler.h 00:04:08.112 TEST_HEADER include/spdk/scsi_spec.h 00:04:08.112 TEST_HEADER include/spdk/sock.h 00:04:08.112 CC app/spdk_tgt/spdk_tgt.o 00:04:08.112 TEST_HEADER include/spdk/stdinc.h 00:04:08.112 TEST_HEADER include/spdk/string.h 00:04:08.112 TEST_HEADER include/spdk/trace.h 00:04:08.112 TEST_HEADER include/spdk/thread.h 00:04:08.112 TEST_HEADER include/spdk/tree.h 00:04:08.112 TEST_HEADER include/spdk/trace_parser.h 00:04:08.112 TEST_HEADER include/spdk/ublk.h 00:04:08.112 TEST_HEADER include/spdk/uuid.h 00:04:08.112 TEST_HEADER include/spdk/version.h 00:04:08.112 TEST_HEADER include/spdk/util.h 00:04:08.112 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:08.112 TEST_HEADER include/spdk/vhost.h 00:04:08.112 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:08.112 TEST_HEADER include/spdk/vmd.h 00:04:08.112 TEST_HEADER include/spdk/xor.h 00:04:08.112 TEST_HEADER include/spdk/zipf.h 00:04:08.112 CXX test/cpp_headers/accel_module.o 00:04:08.112 CXX test/cpp_headers/assert.o 00:04:08.112 CXX test/cpp_headers/accel.o 00:04:08.112 CXX test/cpp_headers/barrier.o 00:04:08.112 CXX test/cpp_headers/bdev.o 00:04:08.112 CXX test/cpp_headers/bdev_module.o 00:04:08.112 CXX test/cpp_headers/base64.o 00:04:08.112 CXX test/cpp_headers/bit_pool.o 00:04:08.112 CXX test/cpp_headers/blob_bdev.o 00:04:08.112 CXX test/cpp_headers/bdev_zone.o 00:04:08.112 CXX test/cpp_headers/bit_array.o 00:04:08.112 CXX 
test/cpp_headers/blobfs_bdev.o 00:04:08.112 CXX test/cpp_headers/blobfs.o 00:04:08.112 CXX test/cpp_headers/blob.o 00:04:08.112 CXX test/cpp_headers/conf.o 00:04:08.112 CXX test/cpp_headers/config.o 00:04:08.112 CXX test/cpp_headers/crc16.o 00:04:08.112 CXX test/cpp_headers/cpuset.o 00:04:08.112 CXX test/cpp_headers/crc32.o 00:04:08.112 CXX test/cpp_headers/dif.o 00:04:08.112 CXX test/cpp_headers/crc64.o 00:04:08.112 CXX test/cpp_headers/endian.o 00:04:08.112 CXX test/cpp_headers/env_dpdk.o 00:04:08.112 CXX test/cpp_headers/dma.o 00:04:08.112 CXX test/cpp_headers/event.o 00:04:08.112 CXX test/cpp_headers/env.o 00:04:08.112 CXX test/cpp_headers/fd_group.o 00:04:08.112 CXX test/cpp_headers/file.o 00:04:08.112 CXX test/cpp_headers/gpt_spec.o 00:04:08.112 CXX test/cpp_headers/fd.o 00:04:08.112 CXX test/cpp_headers/ftl.o 00:04:08.112 CXX test/cpp_headers/hexlify.o 00:04:08.112 CXX test/cpp_headers/histogram_data.o 00:04:08.112 CXX test/cpp_headers/idxd_spec.o 00:04:08.112 CXX test/cpp_headers/idxd.o 00:04:08.112 CXX test/cpp_headers/init.o 00:04:08.112 CXX test/cpp_headers/ioat_spec.o 00:04:08.112 CXX test/cpp_headers/ioat.o 00:04:08.112 CXX test/cpp_headers/iscsi_spec.o 00:04:08.112 CXX test/cpp_headers/json.o 00:04:08.112 CXX test/cpp_headers/jsonrpc.o 00:04:08.112 CXX test/cpp_headers/keyring_module.o 00:04:08.112 CXX test/cpp_headers/keyring.o 00:04:08.112 CXX test/cpp_headers/likely.o 00:04:08.112 CXX test/cpp_headers/lvol.o 00:04:08.112 CXX test/cpp_headers/log.o 00:04:08.112 CXX test/cpp_headers/memory.o 00:04:08.112 CXX test/cpp_headers/mmio.o 00:04:08.112 CXX test/cpp_headers/nbd.o 00:04:08.112 CXX test/cpp_headers/notify.o 00:04:08.112 CXX test/cpp_headers/net.o 00:04:08.112 CXX test/cpp_headers/nvme.o 00:04:08.112 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:08.112 CXX test/cpp_headers/nvme_intel.o 00:04:08.112 CXX test/cpp_headers/nvme_ocssd.o 00:04:08.112 CXX test/cpp_headers/nvme_zns.o 00:04:08.112 CXX test/cpp_headers/nvme_spec.o 00:04:08.113 CXX test/cpp_headers/nvmf_cmd.o 00:04:08.113 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:08.113 CXX test/cpp_headers/nvmf.o 00:04:08.113 CXX test/cpp_headers/nvmf_spec.o 00:04:08.113 CXX test/cpp_headers/nvmf_transport.o 00:04:08.113 CXX test/cpp_headers/opal.o 00:04:08.113 CXX test/cpp_headers/opal_spec.o 00:04:08.113 CXX test/cpp_headers/pci_ids.o 00:04:08.113 CXX test/cpp_headers/pipe.o 00:04:08.113 CXX test/cpp_headers/queue.o 00:04:08.113 CXX test/cpp_headers/rpc.o 00:04:08.113 CXX test/cpp_headers/reduce.o 00:04:08.113 CXX test/cpp_headers/scheduler.o 00:04:08.113 CC examples/ioat/verify/verify.o 00:04:08.113 CXX test/cpp_headers/scsi.o 00:04:08.113 CXX test/cpp_headers/scsi_spec.o 00:04:08.113 CXX test/cpp_headers/sock.o 00:04:08.113 CXX test/cpp_headers/stdinc.o 00:04:08.113 CXX test/cpp_headers/string.o 00:04:08.113 CXX test/cpp_headers/thread.o 00:04:08.113 CC app/fio/nvme/fio_plugin.o 00:04:08.113 CC examples/util/zipf/zipf.o 00:04:08.113 CXX test/cpp_headers/trace_parser.o 00:04:08.113 CXX test/cpp_headers/trace.o 00:04:08.113 CXX test/cpp_headers/tree.o 00:04:08.113 CXX test/cpp_headers/ublk.o 00:04:08.113 CXX test/cpp_headers/util.o 00:04:08.113 CC test/thread/poller_perf/poller_perf.o 00:04:08.113 CC examples/ioat/perf/perf.o 00:04:08.113 CC test/app/jsoncat/jsoncat.o 00:04:08.113 CC test/app/stub/stub.o 00:04:08.113 CXX test/cpp_headers/uuid.o 00:04:08.113 CC test/app/histogram_perf/histogram_perf.o 00:04:08.113 CC test/env/memory/memory_ut.o 00:04:08.390 CXX test/cpp_headers/version.o 00:04:08.390 CC test/env/pci/pci_ut.o 
00:04:08.390 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:08.390 CC test/app/bdev_svc/bdev_svc.o 00:04:08.390 CC app/fio/bdev/fio_plugin.o 00:04:08.390 CC test/env/vtophys/vtophys.o 00:04:08.390 CXX test/cpp_headers/vfio_user_pci.o 00:04:08.390 LINK spdk_lspci 00:04:08.390 CC test/dma/test_dma/test_dma.o 00:04:08.390 CXX test/cpp_headers/vfio_user_spec.o 00:04:08.390 LINK rpc_client_test 00:04:08.687 LINK spdk_nvme_discover 00:04:08.687 LINK nvmf_tgt 00:04:08.687 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:08.956 LINK interrupt_tgt 00:04:08.956 LINK spdk_trace_record 00:04:08.956 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:08.957 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:08.957 CC test/env/mem_callbacks/mem_callbacks.o 00:04:08.957 LINK jsoncat 00:04:08.957 LINK histogram_perf 00:04:08.957 LINK poller_perf 00:04:08.957 LINK iscsi_tgt 00:04:08.957 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:08.957 LINK zipf 00:04:08.957 CXX test/cpp_headers/vhost.o 00:04:08.957 CXX test/cpp_headers/vmd.o 00:04:08.957 CXX test/cpp_headers/xor.o 00:04:08.957 CXX test/cpp_headers/zipf.o 00:04:08.957 LINK stub 00:04:08.957 LINK env_dpdk_post_init 00:04:08.957 LINK spdk_tgt 00:04:08.957 LINK verify 00:04:08.957 LINK vtophys 00:04:08.957 LINK ioat_perf 00:04:08.957 LINK bdev_svc 00:04:09.214 LINK spdk_dd 00:04:09.214 LINK spdk_trace 00:04:09.214 LINK pci_ut 00:04:09.214 LINK test_dma 00:04:09.472 LINK spdk_bdev 00:04:09.472 LINK nvme_fuzz 00:04:09.472 LINK spdk_nvme 00:04:09.472 LINK vhost_fuzz 00:04:09.472 LINK mem_callbacks 00:04:09.472 LINK spdk_nvme_identify 00:04:09.472 LINK spdk_nvme_perf 00:04:09.472 CC test/event/event_perf/event_perf.o 00:04:09.472 CC test/event/reactor_perf/reactor_perf.o 00:04:09.472 CC test/event/reactor/reactor.o 00:04:09.472 CC examples/sock/hello_world/hello_sock.o 00:04:09.472 CC examples/idxd/perf/perf.o 00:04:09.472 CC examples/vmd/lsvmd/lsvmd.o 00:04:09.472 CC examples/vmd/led/led.o 00:04:09.472 CC test/event/scheduler/scheduler.o 00:04:09.472 CC test/event/app_repeat/app_repeat.o 00:04:09.472 CC examples/thread/thread/thread_ex.o 00:04:09.731 CC app/vhost/vhost.o 00:04:09.731 LINK spdk_top 00:04:09.731 LINK reactor_perf 00:04:09.731 LINK reactor 00:04:09.731 LINK event_perf 00:04:09.731 LINK lsvmd 00:04:09.731 LINK led 00:04:09.731 LINK memory_ut 00:04:09.731 LINK app_repeat 00:04:09.731 LINK hello_sock 00:04:09.731 LINK vhost 00:04:09.989 CC test/nvme/reset/reset.o 00:04:09.989 LINK scheduler 00:04:09.989 CC test/nvme/boot_partition/boot_partition.o 00:04:09.989 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:09.989 LINK thread 00:04:09.989 CC test/nvme/reserve/reserve.o 00:04:09.989 CC test/nvme/aer/aer.o 00:04:09.989 CC test/nvme/overhead/overhead.o 00:04:09.989 CC test/nvme/fdp/fdp.o 00:04:09.989 CC test/nvme/compliance/nvme_compliance.o 00:04:09.989 CC test/nvme/startup/startup.o 00:04:09.989 CC test/nvme/connect_stress/connect_stress.o 00:04:09.989 CC test/nvme/cuse/cuse.o 00:04:09.989 CC test/nvme/err_injection/err_injection.o 00:04:09.989 CC test/nvme/e2edp/nvme_dp.o 00:04:09.989 CC test/nvme/fused_ordering/fused_ordering.o 00:04:09.989 CC test/nvme/sgl/sgl.o 00:04:09.989 CC test/nvme/simple_copy/simple_copy.o 00:04:09.989 LINK idxd_perf 00:04:09.989 CC test/blobfs/mkfs/mkfs.o 00:04:09.989 CC test/accel/dif/dif.o 00:04:09.989 CC test/lvol/esnap/esnap.o 00:04:09.989 LINK boot_partition 00:04:09.989 LINK err_injection 00:04:09.989 LINK connect_stress 00:04:09.989 LINK startup 00:04:10.247 LINK doorbell_aers 00:04:10.247 LINK fused_ordering 
00:04:10.247 LINK nvme_dp 00:04:10.247 LINK reserve 00:04:10.247 LINK mkfs 00:04:10.247 LINK reset 00:04:10.247 LINK nvme_compliance 00:04:10.247 LINK simple_copy 00:04:10.247 LINK aer 00:04:10.247 LINK overhead 00:04:10.247 LINK sgl 00:04:10.247 LINK fdp 00:04:10.247 LINK iscsi_fuzz 00:04:10.247 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:10.247 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:10.247 CC examples/nvme/reconnect/reconnect.o 00:04:10.247 CC examples/nvme/abort/abort.o 00:04:10.247 CC examples/nvme/hello_world/hello_world.o 00:04:10.247 CC examples/nvme/arbitration/arbitration.o 00:04:10.247 CC examples/nvme/hotplug/hotplug.o 00:04:10.247 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:10.505 LINK dif 00:04:10.505 CC examples/accel/perf/accel_perf.o 00:04:10.505 CC examples/blob/hello_world/hello_blob.o 00:04:10.505 CC examples/blob/cli/blobcli.o 00:04:10.505 LINK cmb_copy 00:04:10.505 LINK pmr_persistence 00:04:10.505 LINK hello_world 00:04:10.505 LINK hotplug 00:04:10.763 LINK reconnect 00:04:10.763 LINK arbitration 00:04:10.763 LINK abort 00:04:10.763 LINK hello_blob 00:04:10.763 LINK nvme_manage 00:04:11.024 LINK accel_perf 00:04:11.024 LINK blobcli 00:04:11.024 CC test/bdev/bdevio/bdevio.o 00:04:11.024 LINK cuse 00:04:11.590 LINK bdevio 00:04:11.590 CC examples/bdev/hello_world/hello_bdev.o 00:04:11.590 CC examples/bdev/bdevperf/bdevperf.o 00:04:11.849 LINK hello_bdev 00:04:12.415 LINK bdevperf 00:04:12.981 CC examples/nvmf/nvmf/nvmf.o 00:04:13.239 LINK nvmf 00:04:14.616 LINK esnap 00:04:15.198 00:04:15.198 real 1m25.488s 00:04:15.198 user 15m31.022s 00:04:15.198 sys 5m30.510s 00:04:15.198 07:09:47 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:04:15.198 07:09:47 make -- common/autotest_common.sh@10 -- $ set +x 00:04:15.198 ************************************ 00:04:15.198 END TEST make 00:04:15.198 ************************************ 00:04:15.198 07:09:47 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:15.198 07:09:47 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:15.198 07:09:47 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:15.198 07:09:47 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.198 07:09:47 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:15.198 07:09:47 -- pm/common@44 -- $ pid=1399823 00:04:15.198 07:09:47 -- pm/common@50 -- $ kill -TERM 1399823 00:04:15.198 07:09:47 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.198 07:09:47 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:15.198 07:09:47 -- pm/common@44 -- $ pid=1399825 00:04:15.198 07:09:47 -- pm/common@50 -- $ kill -TERM 1399825 00:04:15.198 07:09:47 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.198 07:09:47 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:15.198 07:09:47 -- pm/common@44 -- $ pid=1399827 00:04:15.198 07:09:47 -- pm/common@50 -- $ kill -TERM 1399827 00:04:15.198 07:09:47 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.198 07:09:47 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:15.198 07:09:47 -- pm/common@44 -- $ pid=1399849 00:04:15.198 07:09:47 -- pm/common@50 -- $ sudo -E kill -TERM 1399849 00:04:15.198 07:09:47 -- spdk/autotest.sh@25 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:04:15.198 07:09:47 -- nvmf/common.sh@7 -- # uname -s 00:04:15.198 07:09:47 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:15.198 07:09:47 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:15.198 07:09:47 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:15.198 07:09:47 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:15.198 07:09:47 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:15.198 07:09:47 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:15.198 07:09:47 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:15.198 07:09:47 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:15.198 07:09:47 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:15.198 07:09:47 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:15.198 07:09:47 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:04:15.198 07:09:47 -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:04:15.198 07:09:47 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:15.198 07:09:47 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:15.198 07:09:47 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:15.198 07:09:47 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:15.198 07:09:47 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:04:15.198 07:09:47 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:15.198 07:09:47 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:15.198 07:09:47 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:15.198 07:09:47 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:15.198 07:09:47 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:15.199 07:09:47 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:15.199 07:09:47 -- paths/export.sh@5 -- # export PATH 00:04:15.199 07:09:47 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:15.199 07:09:47 -- nvmf/common.sh@47 -- # : 0 00:04:15.199 07:09:47 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:04:15.199 07:09:47 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:04:15.199 07:09:47 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:15.199 07:09:47 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:15.199 07:09:47 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:15.199 
07:09:47 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:04:15.199 07:09:47 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:04:15.199 07:09:47 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:04:15.199 07:09:47 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:15.199 07:09:47 -- spdk/autotest.sh@32 -- # uname -s 00:04:15.199 07:09:47 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:15.199 07:09:47 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:15.199 07:09:47 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:15.199 07:09:47 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:15.199 07:09:47 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:15.199 07:09:47 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:15.199 07:09:47 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:15.199 07:09:47 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:15.199 07:09:47 -- spdk/autotest.sh@48 -- # udevadm_pid=1470809 00:04:15.199 07:09:47 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:15.199 07:09:47 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:15.199 07:09:47 -- pm/common@17 -- # local monitor 00:04:15.199 07:09:47 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.199 07:09:47 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.199 07:09:47 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.199 07:09:47 -- pm/common@21 -- # date +%s 00:04:15.199 07:09:47 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:15.199 07:09:47 -- pm/common@21 -- # date +%s 00:04:15.199 07:09:47 -- pm/common@25 -- # sleep 1 00:04:15.199 07:09:47 -- pm/common@21 -- # date +%s 00:04:15.199 07:09:47 -- pm/common@21 -- # date +%s 00:04:15.199 07:09:47 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721884187 00:04:15.199 07:09:47 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721884187 00:04:15.199 07:09:47 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721884187 00:04:15.199 07:09:47 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721884187 00:04:15.456 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721884187_collect-cpu-temp.pm.log 00:04:15.456 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721884187_collect-vmstat.pm.log 00:04:15.456 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721884187_collect-cpu-load.pm.log 00:04:15.456 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721884187_collect-bmc-pm.bmc.pm.log 00:04:16.392 07:09:48 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup 
|| :; exit 1' SIGINT SIGTERM EXIT 00:04:16.392 07:09:48 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:16.392 07:09:48 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:16.392 07:09:48 -- common/autotest_common.sh@10 -- # set +x 00:04:16.392 07:09:48 -- spdk/autotest.sh@59 -- # create_test_list 00:04:16.392 07:09:48 -- common/autotest_common.sh@748 -- # xtrace_disable 00:04:16.392 07:09:48 -- common/autotest_common.sh@10 -- # set +x 00:04:16.392 07:09:48 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:04:16.392 07:09:48 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:16.392 07:09:48 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:16.392 07:09:48 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:04:16.392 07:09:48 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:16.392 07:09:48 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:16.392 07:09:48 -- common/autotest_common.sh@1455 -- # uname 00:04:16.392 07:09:48 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:04:16.392 07:09:48 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:16.392 07:09:48 -- common/autotest_common.sh@1475 -- # uname 00:04:16.392 07:09:48 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:04:16.392 07:09:48 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:04:16.392 07:09:48 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:04:16.392 07:09:48 -- spdk/autotest.sh@72 -- # hash lcov 00:04:16.392 07:09:48 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:04:16.392 07:09:48 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:04:16.392 --rc lcov_branch_coverage=1 00:04:16.392 --rc lcov_function_coverage=1 00:04:16.392 --rc genhtml_branch_coverage=1 00:04:16.392 --rc genhtml_function_coverage=1 00:04:16.392 --rc genhtml_legend=1 00:04:16.392 --rc geninfo_all_blocks=1 00:04:16.392 ' 00:04:16.392 07:09:48 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:04:16.392 --rc lcov_branch_coverage=1 00:04:16.392 --rc lcov_function_coverage=1 00:04:16.392 --rc genhtml_branch_coverage=1 00:04:16.392 --rc genhtml_function_coverage=1 00:04:16.392 --rc genhtml_legend=1 00:04:16.392 --rc geninfo_all_blocks=1 00:04:16.392 ' 00:04:16.392 07:09:48 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:04:16.392 --rc lcov_branch_coverage=1 00:04:16.392 --rc lcov_function_coverage=1 00:04:16.392 --rc genhtml_branch_coverage=1 00:04:16.392 --rc genhtml_function_coverage=1 00:04:16.392 --rc genhtml_legend=1 00:04:16.392 --rc geninfo_all_blocks=1 00:04:16.392 --no-external' 00:04:16.392 07:09:48 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:04:16.392 --rc lcov_branch_coverage=1 00:04:16.392 --rc lcov_function_coverage=1 00:04:16.392 --rc genhtml_branch_coverage=1 00:04:16.392 --rc genhtml_function_coverage=1 00:04:16.392 --rc genhtml_legend=1 00:04:16.392 --rc geninfo_all_blocks=1 00:04:16.392 --no-external' 00:04:16.392 07:09:48 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:04:16.392 lcov: LCOV version 1.14 00:04:16.392 07:09:48 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 
--rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:04:18.296 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:04:18.296 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:04:18.296 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:04:18.297 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:04:18.297 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:04:18.297 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:04:18.556 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:04:18.556 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:04:18.556 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:04:18.556 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:04:18.815 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:04:18.815 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:04:18.815 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:04:18.815 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:04:18.815 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:04:18.815 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:04:18.815 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:04:18.815 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:04:18.815 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 
00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:04:18.816 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:04:18.816 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:04:18.816 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:04:33.708 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:04:33.708 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:51.837 07:10:21 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:04:51.837 07:10:21 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:51.837 07:10:21 -- common/autotest_common.sh@10 -- # set +x 00:04:51.837 07:10:21 -- spdk/autotest.sh@91 -- # rm -f 00:04:51.837 07:10:21 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:53.214 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:53.215 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:53.215 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:53.473 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:53.731 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:53.731 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:53.731 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:53.731 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:53.731 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:53.731 07:10:26 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:04:53.731 07:10:26 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:53.731 07:10:26 -- 
common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:53.731 07:10:26 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:53.731 07:10:26 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:53.731 07:10:26 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:53.731 07:10:26 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:53.731 07:10:26 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:53.731 07:10:26 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:53.731 07:10:26 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:04:53.731 07:10:26 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:04:53.731 07:10:26 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:04:53.731 07:10:26 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:04:53.732 07:10:26 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:04:53.732 07:10:26 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:53.732 No valid GPT data, bailing 00:04:53.732 07:10:26 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:53.732 07:10:26 -- scripts/common.sh@391 -- # pt= 00:04:53.732 07:10:26 -- scripts/common.sh@392 -- # return 1 00:04:53.732 07:10:26 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:53.732 1+0 records in 00:04:53.732 1+0 records out 00:04:53.732 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00501789 s, 209 MB/s 00:04:53.732 07:10:26 -- spdk/autotest.sh@118 -- # sync 00:04:53.732 07:10:26 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:53.732 07:10:26 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:53.732 07:10:26 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:05:01.852 07:10:33 -- spdk/autotest.sh@124 -- # uname -s 00:05:01.852 07:10:33 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:05:01.852 07:10:33 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:05:01.852 07:10:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:01.852 07:10:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:01.852 07:10:33 -- common/autotest_common.sh@10 -- # set +x 00:05:01.852 ************************************ 00:05:01.852 START TEST setup.sh 00:05:01.852 ************************************ 00:05:01.852 07:10:33 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:05:01.852 * Looking for test storage... 
00:05:01.852 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:01.852 07:10:33 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:05:01.852 07:10:33 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:05:01.852 07:10:33 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:05:01.852 07:10:33 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:01.852 07:10:33 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:01.852 07:10:33 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:01.852 ************************************ 00:05:01.852 START TEST acl 00:05:01.852 ************************************ 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:05:01.852 * Looking for test storage... 00:05:01.852 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:01.852 07:10:33 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:01.852 07:10:33 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:01.852 07:10:33 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:05:01.852 07:10:33 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:05:01.852 07:10:33 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:05:01.852 07:10:33 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:05:01.852 07:10:33 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:05:01.852 07:10:33 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:01.852 07:10:33 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:06.041 07:10:38 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:05:06.041 07:10:38 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:05:06.041 07:10:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.041 07:10:38 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:05:06.041 07:10:38 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:05:06.041 07:10:38 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:10.236 Hugepages 00:05:10.236 node hugesize free / total 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.236 00:05:10.236 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:05:10.236 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- 
# [[ 0000:80:04.1 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:05:10.237 07:10:42 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:05:10.237 07:10:42 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:10.237 07:10:42 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:10.237 07:10:42 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:10.237 ************************************ 00:05:10.237 START TEST denied 00:05:10.237 ************************************ 00:05:10.237 07:10:42 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # denied 00:05:10.237 07:10:42 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:05:10.237 07:10:42 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:05:10.237 
07:10:42 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:05:10.237 07:10:42 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:05:10.237 07:10:42 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:14.432 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:14.432 07:10:46 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:19.708 00:05:19.708 real 0m9.710s 00:05:19.708 user 0m3.074s 00:05:19.708 sys 0m5.929s 00:05:19.708 07:10:52 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:19.708 07:10:52 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:19.708 ************************************ 00:05:19.708 END TEST denied 00:05:19.708 ************************************ 00:05:19.708 07:10:52 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:19.708 07:10:52 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:19.708 07:10:52 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:19.708 07:10:52 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:19.708 ************************************ 00:05:19.708 START TEST allowed 00:05:19.708 ************************************ 00:05:19.708 07:10:52 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:05:19.708 07:10:52 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:05:19.708 07:10:52 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:19.708 07:10:52 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:05:19.708 07:10:52 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:19.708 07:10:52 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:26.280 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:26.280 07:10:58 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:26.280 07:10:58 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:26.280 07:10:58 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:26.280 07:10:58 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:26.280 07:10:58 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:30.477 00:05:30.477 real 0m10.693s 00:05:30.477 user 0m2.952s 00:05:30.477 sys 0m5.920s 00:05:30.477 07:11:02 setup.sh.acl.allowed -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:05:30.477 07:11:02 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:30.477 ************************************ 00:05:30.477 END TEST allowed 00:05:30.477 ************************************ 00:05:30.477 00:05:30.477 real 0m29.444s 00:05:30.477 user 0m9.282s 00:05:30.477 sys 0m17.951s 00:05:30.477 07:11:02 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:30.477 07:11:02 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:30.477 ************************************ 00:05:30.477 END TEST acl 00:05:30.477 ************************************ 00:05:30.477 07:11:02 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:30.477 07:11:02 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:30.477 07:11:02 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:30.477 07:11:02 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:30.477 ************************************ 00:05:30.477 START TEST hugepages 00:05:30.477 ************************************ 00:05:30.477 07:11:03 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:30.736 * Looking for test storage... 00:05:30.736 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41383048 kB' 'MemAvailable: 45373880 kB' 'Buffers: 6064 kB' 'Cached: 10568472 kB' 'SwapCached: 0 kB' 'Active: 7393024 kB' 'Inactive: 3689560 kB' 'Active(anon): 6994600 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511140 kB' 
'Mapped: 201208 kB' 'Shmem: 6486552 kB' 'KReclaimable: 550556 kB' 'Slab: 1203068 kB' 'SReclaimable: 550556 kB' 'SUnreclaim: 652512 kB' 'KernelStack: 22432 kB' 'PageTables: 9144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439060 kB' 'Committed_AS: 8472088 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218764 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.736 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 
07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 
-- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r 
var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.737 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
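The block above is xtrace output from the get_meminfo helper walking /proc/meminfo one field at a time until it reaches Hugepagesize; every "[[ X == \H\u\g\e\p\a\g\e\s\i\z\e ]]" / "continue" pair is one non-matching iteration of that loop. For reference, a minimal stand-alone sketch of that scan (hypothetical function name, simplified to /proc/meminfo only; not the exact setup/common.sh code) could look like this:

  # get_meminfo_sketch KEY  -- print the numeric value of KEY from /proc/meminfo.
  # (Hypothetical helper; the traced setup/common.sh version also supports the
  # per-node meminfo files by stripping their "Node <n>" prefix first.)
  get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
      if [[ $var == "$get" ]]; then
        echo "$val"          # e.g. 2048 for Hugepagesize
        return 0
      fi
    done < /proc/meminfo
    return 1                 # requested key not found
  }
  # usage: default_hugepages=$(get_meminfo_sketch Hugepagesize)   # -> 2048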
00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in 
"/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:30.738 07:11:03 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:05:30.738 07:11:03 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:30.738 07:11:03 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:30.738 07:11:03 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:30.738 ************************************ 00:05:30.738 START TEST default_setup 00:05:30.738 ************************************ 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # default_setup 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:05:30.738 07:11:03 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:34.955 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:34.955 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:36.860 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
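Just before this point the trace shows clear_hp zeroing every per-node hugepage pool and default_setup requesting 2097152 kB of 2048 kB pages on node 0 (nodes_test[0]=1024) before handing off to scripts/setup.sh. A hedged, stand-alone approximation of those two sysfs operations (illustrative only; the test itself delegates the final allocation to scripts/setup.sh, and both writes require root):

  # Drop every hugepage pool size on every NUMA node (what clear_hp does in effect).
  for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
      echo 0 > "$hp/nr_hugepages"
    done
  done

  # default_setup's request: 2097152 kB / 2048 kB per page = 1024 pages on node 0.
  echo 1024 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages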
00:05:37.129 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43521392 kB' 'MemAvailable: 47511672 kB' 'Buffers: 6064 kB' 'Cached: 10568636 kB' 'SwapCached: 0 kB' 'Active: 7410752 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012328 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528508 kB' 'Mapped: 202376 kB' 'Shmem: 6486716 kB' 'KReclaimable: 550004 kB' 'Slab: 1201368 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651364 kB' 'KernelStack: 22448 kB' 'PageTables: 8804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8519356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 
07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.130 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43524448 kB' 'MemAvailable: 47514728 kB' 'Buffers: 6064 kB' 'Cached: 10568636 kB' 'SwapCached: 0 kB' 'Active: 7410664 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012240 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528812 kB' 'Mapped: 202288 kB' 'Shmem: 6486716 kB' 'KReclaimable: 550004 kB' 'Slab: 1201668 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651664 kB' 'KernelStack: 22192 kB' 'PageTables: 9160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8520984 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.131 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 
07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.132 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:37.133 07:11:09 
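The pass above walks /proc/meminfo one "key: value" pair at a time, skipping every key until it reaches HugePages_Surp, which reports 0 surplus pages on this host, before moving on to HugePages_Rsvd. A compact sketch of the same lookup is shown below; it mirrors the traced get_meminfo's interface but not its implementation (awk here instead of the mapfile/read loop seen in the trace), so treat it as an illustration rather than SPDK's setup/common.sh:

    # Sketch: return one value from /proc/meminfo, or from a per-node meminfo
    # file when a NUMA node id is given; print 0 if the key is absent.
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        # Per-node files prefix each line with "Node <id>"; strip that before matching.
        awk -v key="$get" '{ sub(/^Node [0-9]+ /, "") }
            $1 == (key ":") { print $2; found = 1 }
            END { if (!found) print 0 }' "$mem_f"
    }

    get_meminfo HugePages_Surp      # -> 0 on this run
    get_meminfo HugePages_Surp 0    # same key, limited to NUMA node 0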
setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43524300 kB' 'MemAvailable: 47514580 kB' 'Buffers: 6064 kB' 'Cached: 10568656 kB' 'SwapCached: 0 kB' 'Active: 7411036 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012612 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529268 kB' 'Mapped: 202288 kB' 'Shmem: 6486736 kB' 'KReclaimable: 550004 kB' 'Slab: 1201660 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651656 kB' 'KernelStack: 22480 kB' 'PageTables: 9544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8519396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218860 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.133 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.134 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:37.135 nr_hugepages=1024 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:37.135 resv_hugepages=0 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:37.135 surplus_hugepages=0 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:37.135 anon_hugepages=0 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # 
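With anonymous, surplus and reserved hugepages all at 0, the script echoes its summary (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0) and then verifies the pool accounting: the total hugepage count must equal the requested count plus surplus and reserved pages, after which it re-reads HugePages_Total. A minimal stand-alone sketch of that check follows; the variable names and the assumption that the requested count comes from vm.nr_hugepages are illustrative, not taken from hugepages.sh:

    # Sketch of the accounting check traced above (illustrative names and sources).
    meminfo_val() { awk -v k="$1" '$1 == (k ":") { print $2 }' /proc/meminfo; }
    nr_hugepages=$(cat /proc/sys/vm/nr_hugepages)   # requested pool size (assumed source)
    surp=$(meminfo_val HugePages_Surp)              # 0 in this run
    resv=$(meminfo_val HugePages_Rsvd)              # 0 in this run
    total=$(meminfo_val HugePages_Total)            # 1024 in this run
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage pool is consistent: $total pages"
    fi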
mem_f=/proc/meminfo 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43524276 kB' 'MemAvailable: 47514556 kB' 'Buffers: 6064 kB' 'Cached: 10568676 kB' 'SwapCached: 0 kB' 'Active: 7411532 kB' 'Inactive: 3689560 kB' 'Active(anon): 7013108 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529704 kB' 'Mapped: 202288 kB' 'Shmem: 6486756 kB' 'KReclaimable: 550004 kB' 'Slab: 1202196 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 652192 kB' 'KernelStack: 22592 kB' 'PageTables: 9864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8521028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218860 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- 
# IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.135 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.136 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 25842836 kB' 'MemUsed: 6796304 kB' 'SwapCached: 0 kB' 'Active: 2868472 kB' 'Inactive: 231284 kB' 'Active(anon): 2735424 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2728428 kB' 'Mapped: 99596 kB' 'AnonPages: 374504 kB' 'Shmem: 2364096 kB' 'KernelStack: 12456 kB' 'PageTables: 6272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 220284 kB' 'Slab: 523344 kB' 'SReclaimable: 220284 kB' 'SUnreclaim: 303060 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.137 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.138 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:37.399 node0=1024 expecting 1024 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:37.399 00:05:37.399 real 0m6.431s 00:05:37.399 user 0m1.726s 00:05:37.399 sys 0m2.839s 00:05:37.399 07:11:09 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:37.400 07:11:09 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:05:37.400 ************************************ 00:05:37.400 END TEST default_setup 00:05:37.400 ************************************ 00:05:37.400 07:11:09 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:05:37.400 07:11:09 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:37.400 07:11:09 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:37.400 07:11:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:37.400 ************************************ 00:05:37.400 START TEST per_node_1G_alloc 00:05:37.400 
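[Editor's note] The test starting here sizes a fixed hugepage pool on each NUMA node. In the trace just below, the 1048576 kB request at the 2048 kB default hugepage size reported by this host works out to 512 pages, and both node 0 and node 1 are asked for that many (NRHUGE=512, HUGENODE=0,1) before spdk/scripts/setup.sh is rerun. As a standalone sanity check of that arithmetic (a hedged sketch, not part of the test scripts themselves):

#!/usr/bin/env bash
# Sketch only: reproduce the per_node_1G_alloc sizing seen in the trace below.
size_kb=1048576                            # requested allocation, in kB (1 GiB)
hugepage_kb=2048                           # Hugepagesize reported in this run
nr_hugepages=$(( size_kb / hugepage_kb ))  # 1048576 / 2048 = 512

for node in 0 1; do                        # HUGENODE=0,1 in the trace
    echo "node${node}: ${nr_hugepages} hugepages requested"
done
# The test then sets NRHUGE=512 HUGENODE=0,1, reruns spdk/scripts/setup.sh,
# and re-reads HugePages_Total from each node's meminfo to verify the split.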
************************************ 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # per_node_1G_alloc 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:37.400 07:11:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:41.603 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:00:04.2 (8086 2021): Already 
using the vfio-pci driver 00:05:41.603 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:41.603 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.603 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43518596 kB' 'MemAvailable: 47508876 kB' 'Buffers: 6064 kB' 'Cached: 10568784 kB' 'SwapCached: 0 kB' 'Active: 7409304 kB' 'Inactive: 3689560 kB' 'Active(anon): 7010880 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 
kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527292 kB' 'Mapped: 201120 kB' 'Shmem: 6486864 kB' 'KReclaimable: 550004 kB' 'Slab: 1201160 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651156 kB' 'KernelStack: 22128 kB' 'PageTables: 8564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8511976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218796 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
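[Editor's note] The field-by-field scans running through this stretch (AnonHugePages here, then HugePages_Surp) are all the same meminfo walk: open /proc/meminfo, or a node's own meminfo file for per-node queries, strip any leading "Node N" prefix, split each line on ': ', and stop at the requested key. A minimal self-contained sketch of that lookup pattern, with a hypothetical helper name rather than the actual setup/common.sh code:

#!/usr/bin/env bash
# Sketch of the meminfo walk traced in this section; meminfo_field is a
# hypothetical name, not the real setup/common.sh helper.
meminfo_field() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo line var val _
    local node_re='^Node [0-9]+ (.*)$'
    # Per-node lookups read that node's own meminfo file when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while IFS= read -r line; do
        # Per-node files prefix every line with "Node <N> "; drop the prefix.
        [[ $line =~ $node_re ]] && line=${BASH_REMATCH[1]}
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"              # value only, e.g. 0 for AnonHugePages here
            return 0
        fi
    done < "$mem_f"
    return 1
}

# Lookups matching the ones traced around this point:
#   meminfo_field AnonHugePages        # system-wide, from /proc/meminfo
#   meminfo_field HugePages_Surp 0     # node 0, from node0/meminfo

In this run the transparent-hugepage setting is 'always [madvise] never', so the anon check proceeds; the snapshot printed above reports AnonHugePages: 0 kB and HugePages_Surp: 0, which is what these walks return.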
00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.604 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43519024 kB' 'MemAvailable: 47509304 kB' 'Buffers: 6064 kB' 'Cached: 10568784 kB' 'SwapCached: 0 kB' 'Active: 7409540 kB' 'Inactive: 3689560 kB' 'Active(anon): 7011116 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527556 kB' 'Mapped: 201120 kB' 'Shmem: 6486864 kB' 'KReclaimable: 550004 kB' 'Slab: 1201204 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651200 kB' 'KernelStack: 22128 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8511996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.605 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 
-- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 
07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.606 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 
07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43519720 kB' 'MemAvailable: 47510000 kB' 'Buffers: 6064 kB' 'Cached: 10568804 kB' 'SwapCached: 0 kB' 'Active: 7409564 kB' 'Inactive: 3689560 kB' 'Active(anon): 7011140 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527548 kB' 'Mapped: 201120 kB' 'Shmem: 6486884 kB' 'KReclaimable: 550004 kB' 'Slab: 1201204 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651200 kB' 'KernelStack: 22128 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8512152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 
07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.607 07:11:13 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.607 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.608 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:41.609 nr_hugepages=1024 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:41.609 resv_hugepages=0 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:41.609 surplus_hugepages=0 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:41.609 anon_hugepages=0 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43520880 kB' 'MemAvailable: 47511160 kB' 'Buffers: 6064 kB' 'Cached: 10568840 kB' 'SwapCached: 0 kB' 'Active: 7410440 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012016 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528480 kB' 'Mapped: 201140 kB' 'Shmem: 6486920 kB' 'KReclaimable: 550004 kB' 'Slab: 1201196 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651192 kB' 'KernelStack: 22128 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8515424 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218732 kB' 
'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.609 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.610 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:41.611 07:11:13 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 26864052 kB' 'MemUsed: 5775088 kB' 'SwapCached: 0 kB' 'Active: 2869144 kB' 'Inactive: 231284 kB' 'Active(anon): 2736096 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2728460 kB' 'Mapped: 98816 kB' 'AnonPages: 375216 kB' 'Shmem: 2364128 kB' 'KernelStack: 12232 kB' 'PageTables: 5432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 220284 kB' 'Slab: 522680 kB' 'SReclaimable: 220284 kB' 'SUnreclaim: 302396 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.611 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:13 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.612 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.612 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.612 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.612 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16656628 kB' 'MemUsed: 10999452 kB' 'SwapCached: 0 kB' 'Active: 4541136 kB' 'Inactive: 3458276 kB' 'Active(anon): 4275760 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 
'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7846488 kB' 'Mapped: 102332 kB' 'AnonPages: 152972 kB' 'Shmem: 4122836 kB' 'KernelStack: 9976 kB' 'PageTables: 3032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329720 kB' 'Slab: 678516 kB' 'SReclaimable: 329720 kB' 'SUnreclaim: 348796 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.613 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 
07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:41.614 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:41.615 node0=512 expecting 512 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:05:41.615 node1=512 expecting 512 00:05:41.615 07:11:14 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:05:41.615 00:05:41.615 real 0m4.290s 00:05:41.615 user 0m1.612s 00:05:41.615 sys 0m2.744s 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:41.615 07:11:14 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:41.615 ************************************ 00:05:41.615 END TEST per_node_1G_alloc 00:05:41.615 ************************************ 00:05:41.615 07:11:14 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:05:41.615 07:11:14 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:41.615 07:11:14 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:41.615 07:11:14 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:41.615 ************************************ 00:05:41.615 START TEST even_2G_alloc 00:05:41.615 ************************************ 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:41.615 
07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:41.615 07:11:14 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:45.814 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:45.814 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:45.814 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:45.814 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:45.814 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:45.814 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:45.815 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:45.815 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43494908 kB' 'MemAvailable: 47485188 kB' 'Buffers: 6064 kB' 'Cached: 10568960 kB' 'SwapCached: 0 kB' 'Active: 7411440 kB' 'Inactive: 3689560 kB' 'Active(anon): 7013016 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528776 kB' 'Mapped: 201228 kB' 'Shmem: 6487040 kB' 'KReclaimable: 550004 kB' 'Slab: 1201868 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651864 kB' 'KernelStack: 22160 kB' 'PageTables: 8688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8513336 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218924 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.815 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
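The trace above and below is setup/common.sh's get_meminfo scanning /proc/meminfo one key at a time: every key that is not the one being queried (AnonHugePages at this point; HugePages_Surp, HugePages_Rsvd and HugePages_Total follow further down) logs a "continue", and the matching key echoes its value, which setup/hugepages.sh captures as anon/surp/resv. What follows is a minimal sketch of that scan, reconstructed from the traced commands; the mapfile/IFS/read/continue/echo structure mirrors the trace, but the exact wiring of the real setup/common.sh helper may differ.

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern used below

    # Sketch of the meminfo lookup exercised by this trace; not the verbatim
    # SPDK implementation.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # When a NUMA node is passed, prefer its per-node meminfo. In this run
        # $node is empty, which is why the trace tests ".../node/node/meminfo".
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix of per-node files
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # each non-matching key is one "continue" in the log
            echo "$val" && return 0            # match: print the value (0 kB here, hence "echo 0")
        done
    }

    # How hugepages.sh uses it in this test (values as observed in this run):
    anon=$(get_meminfo AnonHugePages)   # 0
    surp=$(get_meminfo HugePages_Surp)  # 0
    resv=$(get_meminfo HugePages_Rsvd)  # 0
    # even_2G_alloc then verifies (( 1024 == nr_hugepages + surp + resv )).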
00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43495148 kB' 'MemAvailable: 47485428 kB' 'Buffers: 6064 kB' 'Cached: 10568964 kB' 'SwapCached: 0 kB' 'Active: 7410636 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012212 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528440 kB' 'Mapped: 201132 kB' 'Shmem: 6487044 kB' 'KReclaimable: 550004 kB' 'Slab: 1201852 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651848 kB' 'KernelStack: 22144 kB' 'PageTables: 8624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8513352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218892 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.816 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.817 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 
07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@99 -- # surp=0 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43494644 kB' 'MemAvailable: 47484924 kB' 'Buffers: 6064 kB' 'Cached: 10568980 kB' 'SwapCached: 0 kB' 'Active: 7410624 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012200 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528440 kB' 'Mapped: 201132 kB' 'Shmem: 6487060 kB' 'KReclaimable: 550004 kB' 'Slab: 1201852 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651848 kB' 'KernelStack: 22144 kB' 'PageTables: 8624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8513376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218892 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.818 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.819 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.820 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:45.821 nr_hugepages=1024 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:45.821 resv_hugepages=0 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:45.821 surplus_hugepages=0 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:45.821 anon_hugepages=0 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:45.821 
07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:45.821 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43495344 kB' 'MemAvailable: 47485624 kB' 'Buffers: 6064 kB' 'Cached: 10569000 kB' 'SwapCached: 0 kB' 'Active: 7410452 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012028 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528220 kB' 'Mapped: 201132 kB' 'Shmem: 6487080 kB' 'KReclaimable: 550004 kB' 'Slab: 1201852 kB' 'SReclaimable: 550004 kB' 'SUnreclaim: 651848 kB' 'KernelStack: 22128 kB' 'PageTables: 8568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8513396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218892 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.822 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 
07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.823 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 26842304 kB' 'MemUsed: 5796836 kB' 'SwapCached: 0 kB' 'Active: 2867476 kB' 'Inactive: 231284 kB' 'Active(anon): 2734428 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2728468 kB' 'Mapped: 98828 kB' 'AnonPages: 373508 kB' 'Shmem: 2364136 kB' 'KernelStack: 12152 kB' 'PageTables: 5536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 220284 kB' 'Slab: 523184 kB' 'SReclaimable: 220284 kB' 'SUnreclaim: 302900 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.824 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 
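[editor's note] The repeated IFS=': ' / read -r var val _ / continue entries in this part of the trace are setup/common.sh's get_meminfo helper scanning a meminfo file one "key: value" pair at a time until it reaches the requested field (here HugePages_Surp for node 1, read from /sys/devices/system/node/node1/meminfo). A minimal standalone sketch of that parsing pattern follows; the function name and loop shape are illustrative, not a copy of the SPDK script.

#!/usr/bin/env bash
# Minimal sketch (not the SPDK source) of the get_meminfo pattern traced here:
# pick /proc/meminfo or a per-node meminfo file, strip the "Node <id> " prefix
# that the per-node files carry, then scan "key: value" pairs until the
# requested field is found and print its value.
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    local -a mem
    local line var val _

    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")       # per-node lines start with e.g. "Node 1 "

    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # same skip-until-match loop as the trace
        echo "$val"
        return 0
    done
    return 1
}

# e.g. get_meminfo_sketch HugePages_Surp 1   -> prints 0, matching the trace's "echo 0"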
00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16653492 kB' 'MemUsed: 11002588 kB' 'SwapCached: 0 kB' 'Active: 4543236 kB' 'Inactive: 3458276 kB' 'Active(anon): 4277860 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7846640 kB' 'Mapped: 102304 kB' 'AnonPages: 154940 kB' 'Shmem: 4122988 kB' 'KernelStack: 9992 kB' 'PageTables: 3088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329720 kB' 'Slab: 678668 kB' 'SReclaimable: 329720 kB' 'SUnreclaim: 348948 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.825 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 
07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:45.826 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:46.086 node0=512 expecting 512 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:05:46.086 node1=512 expecting 512 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:05:46.086 00:05:46.086 real 0m4.226s 00:05:46.086 user 0m1.614s 00:05:46.086 sys 0m2.692s 00:05:46.086 07:11:18 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:46.086 07:11:18 
setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:46.086 ************************************ 00:05:46.086 END TEST even_2G_alloc 00:05:46.086 ************************************ 00:05:46.086 07:11:18 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:05:46.086 07:11:18 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:46.086 07:11:18 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:46.086 07:11:18 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:46.086 ************************************ 00:05:46.086 START TEST odd_alloc 00:05:46.087 ************************************ 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:46.087 07:11:18 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:50.289 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:50.289 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43513204 kB' 'MemAvailable: 47503420 kB' 'Buffers: 6064 kB' 'Cached: 10569132 kB' 'SwapCached: 0 kB' 'Active: 7410040 kB' 
'Inactive: 3689560 kB' 'Active(anon): 7011616 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527580 kB' 'Mapped: 200236 kB' 'Shmem: 6487212 kB' 'KReclaimable: 549940 kB' 'Slab: 1201048 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 651108 kB' 'KernelStack: 22080 kB' 'PageTables: 8492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 8479920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218908 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.289 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
[[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # 
local mem_f mem 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.290 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43514488 kB' 'MemAvailable: 47504704 kB' 'Buffers: 6064 kB' 'Cached: 10569132 kB' 'SwapCached: 0 kB' 'Active: 7410716 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012292 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528340 kB' 'Mapped: 200716 kB' 'Shmem: 6487212 kB' 'KReclaimable: 549940 kB' 'Slab: 1201088 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 651148 kB' 'KernelStack: 22096 kB' 'PageTables: 8560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 8481688 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218860 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.291 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.292 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43515204 kB' 'MemAvailable: 47505420 kB' 'Buffers: 6064 kB' 'Cached: 10569152 kB' 'SwapCached: 0 kB' 'Active: 7415228 kB' 'Inactive: 3689560 kB' 'Active(anon): 7016804 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532836 kB' 'Mapped: 200716 kB' 'Shmem: 6487232 kB' 'KReclaimable: 549940 kB' 'Slab: 1201088 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 651148 kB' 'KernelStack: 22080 kB' 'PageTables: 8508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 8486076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218864 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 
0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.293 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:05:50.294 nr_hugepages=1025 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:50.294 resv_hugepages=0 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:50.294 surplus_hugepages=0 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:50.294 anon_hugepages=0 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:50.294 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43518344 kB' 'MemAvailable: 47508560 kB' 'Buffers: 6064 kB' 'Cached: 10569172 kB' 'SwapCached: 0 kB' 'Active: 7409540 kB' 'Inactive: 3689560 kB' 'Active(anon): 7011116 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527132 kB' 'Mapped: 200212 kB' 'Shmem: 6487252 kB' 'KReclaimable: 549940 kB' 'Slab: 1201088 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 651148 kB' 'KernelStack: 22064 kB' 'PageTables: 8448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 8479980 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218860 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.295 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.296 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.558 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
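The trace above and below is setup/common.sh's get_meminfo walking every /proc/meminfo field with IFS=': ' until it reaches HugePages_Total, at which point it echoes the value (1025 on this system). Stripped of the xtrace noise, the same lookup can be sketched as a small standalone helper; meminfo_get is a hypothetical name used here for illustration only, not part of the SPDK scripts:

    # Print the value of one /proc/meminfo field, mirroring the IFS=': ' scan seen in the trace.
    meminfo_get() {
        local key=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$key" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    # meminfo_get HugePages_Total   -> 1025 in this run

The field-by-field scan resumes below and ends once HugePages_Total matches.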
00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 26860068 kB' 'MemUsed: 5779072 kB' 'SwapCached: 0 kB' 'Active: 2867492 kB' 'Inactive: 231284 kB' 
'Active(anon): 2734444 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2728508 kB' 'Mapped: 98184 kB' 'AnonPages: 373536 kB' 'Shmem: 2364176 kB' 'KernelStack: 12040 kB' 'PageTables: 5332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 220220 kB' 'Slab: 522676 kB' 'SReclaimable: 220220 kB' 'SUnreclaim: 302456 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.559 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16662676 kB' 'MemUsed: 10993404 kB' 'SwapCached: 0 kB' 'Active: 4542036 kB' 'Inactive: 3458276 kB' 'Active(anon): 4276660 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7846780 kB' 'Mapped: 102028 kB' 'AnonPages: 153280 kB' 'Shmem: 4123116 kB' 'KernelStack: 10008 kB' 'PageTables: 3068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329720 kB' 'Slab: 678412 kB' 'SReclaimable: 329720 kB' 'SUnreclaim: 348692 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 
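Having just dumped node 1's meminfo (HugePages_Total: 513, HugePages_Free: 513, HugePages_Surp: 0), the trace below repeats the same field-matching loop against that dump and finally echoes 0 for HugePages_Surp. The per-node files carry a "Node <N>" prefix that the script strips with "${mem[@]#Node +([0-9]) }"; a minimal sketch of the same per-node lookup, using a hypothetical helper name rather than the project's get_meminfo:

    # Print one field for a given NUMA node from sysfs.
    node_meminfo_get() {
        local node=$1 key=$2 var val _
        local f=/sys/devices/system/node/node${node}/meminfo
        [[ -e $f ]] || return 1
        # Lines look like: "Node 1 HugePages_Total:   513"
        while read -r _ _ var val _; do
            [[ ${var%:} == "$key" ]] && { echo "$val"; return 0; }
        done < "$f"
        return 1
    }
    # node_meminfo_get 1 HugePages_Surp   -> 0 in this run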
00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.560 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
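The trace above is the tail of setup/common.sh's get_meminfo scan for the odd_alloc check: each /proc/meminfo field (Inactive(file), Unevictable, Mlocked, Dirty, ...) is read with IFS=': ' and skipped via continue until the requested HugePages_Surp key matches, at which point its value is echoed and the function returns. A minimal standalone sketch of that pattern, assuming plain /proc/meminfo input rather than the mapfile'd array the script actually iterates (the helper name below is illustrative, not the real setup/common.sh function):

    # Walk /proc/meminfo with IFS=': ', skip non-matching fields, and print the
    # value of the requested key (e.g. HugePages_Surp), mirroring the xtrace above.
    get_meminfo_field() {
        local want=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$want" ]] || continue   # Inactive(file), Unevictable, ... are skipped
            echo "$val"
            return 0
        done < /proc/meminfo
        return 1
    }

    get_meminfo_field HugePages_Surp   # prints 0 when no surplus huge pages exist, matching the 'echo 0' seen below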
00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:50.561 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:05:50.562 node0=512 expecting 513 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:05:50.562 node1=513 expecting 512 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:50.562 00:05:50.562 real 0m4.464s 00:05:50.562 user 0m1.688s 00:05:50.562 sys 0m2.860s 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:50.562 07:11:22 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:50.562 ************************************ 00:05:50.562 END TEST odd_alloc 00:05:50.562 ************************************ 00:05:50.562 07:11:22 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:05:50.562 07:11:22 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:50.562 07:11:22 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:50.562 07:11:22 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:50.562 ************************************ 00:05:50.562 START TEST custom_alloc 00:05:50.562 ************************************ 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # 
(( size >= default_hugepages )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:50.562 07:11:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:54.769 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:00:04.2 (8086 
2021): Already using the vfio-pci driver 00:05:54.769 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:54.769 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 42486796 kB' 'MemAvailable: 46477012 kB' 'Buffers: 6064 kB' 'Cached: 10569296 kB' 'SwapCached: 0 kB' 'Active: 7410060 kB' 'Inactive: 3689560 kB' 'Active(anon): 7011636 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527544 kB' 'Mapped: 200244 kB' 
'Shmem: 6487376 kB' 'KReclaimable: 549940 kB' 'Slab: 1200436 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 650496 kB' 'KernelStack: 22048 kB' 'PageTables: 8436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 8481020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218828 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:54.769 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
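In the get_meminfo AnonHugePages call this scan belongs to, the node argument is empty, so the [[ -e /sys/devices/system/node/node/meminfo ]] and [[ -n '' ]] checks earlier in the trace fall through and the values come from the system-wide /proc/meminfo; the "${mem[@]#Node +([0-9]) }" expansion is there to strip the 'Node N ' prefix that per-node sysfs meminfo files carry, so both sources parse identically. A rough sketch of that source selection, assuming a simplified standalone helper (read_meminfo is a hypothetical name, not the setup/common.sh code):

    # Choose per-node meminfo when a node is given and exists, else /proc/meminfo,
    # then strip the leading "Node <N> " so both formats look the same downstream.
    shopt -s extglob
    read_meminfo() {
        local node=$1 mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # no-op for plain /proc/meminfo lines
        printf '%s\n' "${mem[@]}"
    }

    read_meminfo      # system-wide, as in the AnonHugePages lookup traced here
    read_meminfo 0    # node 0 only (hypothetical usage; requires a NUMA sysfs node)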
00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.770 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 
07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local 
node= 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 42486764 kB' 'MemAvailable: 46476980 kB' 'Buffers: 6064 kB' 'Cached: 10569300 kB' 'SwapCached: 0 kB' 'Active: 7410524 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012100 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527984 kB' 'Mapped: 200224 kB' 'Shmem: 6487380 kB' 'KReclaimable: 549940 kB' 'Slab: 1200428 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 650488 kB' 'KernelStack: 22080 kB' 'PageTables: 8512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 8481036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218812 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.771 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.772 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 
07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 42486764 kB' 'MemAvailable: 46476980 kB' 'Buffers: 6064 kB' 'Cached: 10569320 kB' 'SwapCached: 0 kB' 'Active: 7410576 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012152 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527988 kB' 'Mapped: 200224 kB' 'Shmem: 6487400 kB' 'KReclaimable: 549940 kB' 'Slab: 1200428 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 650488 kB' 'KernelStack: 22080 kB' 'PageTables: 8512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 8481060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218812 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.773 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.773 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.774 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 
07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.037 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:05:55.038 nr_hugepages=1536 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:55.038 resv_hugepages=0 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:55.038 surplus_hugepages=0 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:55.038 anon_hugepages=0 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.038 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 42486512 kB' 'MemAvailable: 46476728 kB' 'Buffers: 6064 kB' 'Cached: 10569320 kB' 'SwapCached: 0 kB' 'Active: 7410608 kB' 'Inactive: 3689560 kB' 'Active(anon): 7012184 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528020 kB' 'Mapped: 200224 kB' 'Shmem: 6487400 kB' 'KReclaimable: 549940 kB' 'Slab: 1200428 kB' 'SReclaimable: 549940 kB' 'SUnreclaim: 650488 
kB' 'KernelStack: 22096 kB' 'PageTables: 8564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 8481080 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218812 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
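The repeated continue branches above are setup/common.sh's get_meminfo scanning a captured copy of /proc/meminfo field by field: each 'key: value' line is split with IFS=': ', keys other than the requested one (HugePages_Rsvd earlier, HugePages_Total here) fall through to continue, and the matching value is echoed and returned. A minimal stand-alone sketch of that lookup follows; it assumes only the standard /proc and per-node sysfs meminfo layouts, and the helper name and argument handling are illustrative, not the exact SPDK implementation.

    #!/usr/bin/env bash
    # Echo the value of KEY from /proc/meminfo, or from a NUMA node's meminfo
    # file when a node number is given. Illustrative helper, not the SPDK one.
    shopt -s extglob

    get_mem_field() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo
        local -a mem
        local line var val _
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node N " prefix
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"                    # e.g. 1536 for HugePages_Total above
            return 0
        done
        return 1
    }

    get_mem_field HugePages_Total          # system-wide count
    get_mem_field HugePages_Surp 0         # surplus pages on NUMA node 0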
00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.039 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 
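From here the trace switches to per-node accounting: hugepages.sh's get_nodes walks /sys/devices/system/node/node*, records each node's 2048 kB hugepage count (512 on node0 and 1024 on node1 in this run, matching the 1536 total), and the surrounding loop then re-reads HugePages_Surp from each node's meminfo. A rough equivalent of that enumeration is sketched below; it assumes the default 2048 kB hugepage size, and while the variable names mirror the trace, the code is illustrative rather than the SPDK script itself.

    #!/usr/bin/env bash
    # Collect the per-NUMA-node count of 2048 kB hugepages, similar to what
    # get_nodes does in the trace above. Illustrative, not the SPDK code.
    declare -A nodes_sys

    for node in /sys/devices/system/node/node[0-9]*; do
        n=${node##*node}
        nodes_sys[$n]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done

    no_nodes=${#nodes_sys[@]}
    echo "nodes: $no_nodes"
    for n in "${!nodes_sys[@]}"; do
        echo "node$n: ${nodes_sys[$n]} hugepages"   # e.g. node0: 512, node1: 1024
    done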
00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 26879648 kB' 'MemUsed: 5759492 kB' 'SwapCached: 0 kB' 'Active: 2868644 kB' 'Inactive: 231284 kB' 'Active(anon): 2735596 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2728556 kB' 'Mapped: 98196 kB' 'AnonPages: 374564 kB' 'Shmem: 2364224 kB' 'KernelStack: 12072 kB' 'PageTables: 5424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 220220 kB' 'Slab: 522116 kB' 'SReclaimable: 220220 kB' 'SUnreclaim: 301896 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.040 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.041 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 15606692 kB' 'MemUsed: 12049388 kB' 'SwapCached: 0 kB' 'Active: 4542028 kB' 'Inactive: 3458276 kB' 'Active(anon): 4276652 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7846892 kB' 'Mapped: 102028 kB' 'AnonPages: 153436 kB' 'Shmem: 4123240 kB' 'KernelStack: 10008 kB' 'PageTables: 3088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329720 kB' 'Slab: 678312 kB' 'SReclaimable: 329720 kB' 'SUnreclaim: 348592 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.042 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.043 07:11:27 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:55.043 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:55.044 node0=512 expecting 512 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:55.044 node1=1024 expecting 1024 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:55.044 00:05:55.044 real 0m4.445s 00:05:55.044 user 0m1.650s 00:05:55.044 sys 0m2.871s 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:55.044 07:11:27 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:55.044 ************************************ 00:05:55.044 END TEST custom_alloc 00:05:55.044 ************************************ 00:05:55.044 07:11:27 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:55.044 07:11:27 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:55.044 07:11:27 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:55.044 07:11:27 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:55.044 ************************************ 00:05:55.044 START TEST no_shrink_alloc 00:05:55.044 ************************************ 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:55.044 07:11:27 
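The custom_alloc trace above is dominated by get_meminfo walking every key of the per-node meminfo file until it reaches HugePages_Surp. For readability, here is a minimal bash sketch of what that traced helper appears to do, reconstructed from the setup/common.sh commands visible in the trace; the loop structure and argument handling are assumptions, not the verbatim SPDK implementation.

    # Sketch of the traced get_meminfo helper (reconstructed, not verbatim SPDK).
    # Prints one field from /proc/meminfo, or from the per-NUMA-node meminfo
    # file when a node number is supplied.
    shopt -s extglob   # needed for the "Node +([0-9]) " prefix pattern below

    get_meminfo() {
            local get=$1 node=${2:-}
            local var val _ mem line
            local mem_f=/proc/meminfo

            # A per-node request reads that node's own meminfo copy instead,
            # e.g. /sys/devices/system/node/node1/meminfo in the trace.
            if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
                    mem_f=/sys/devices/system/node/node$node/meminfo
            fi

            mapfile -t mem < "$mem_f"
            # Node files prefix every line with "Node <id> "; strip that prefix
            # so the keys match the plain /proc/meminfo names.
            mem=("${mem[@]#Node +([0-9]) }")

            for line in "${mem[@]}"; do
                    IFS=': ' read -r var val _ <<< "$line"
                    [[ $var == "$get" ]] || continue   # the repeated "continue" lines above
                    echo "$val"                        # e.g. 0 for HugePages_Surp
                    return 0
            done
    }

Called as get_meminfo HugePages_Surp 1 it prints the surplus hugepage count for NUMA node 1, which the trace shows evaluating to 0 before custom_alloc reports node0=512 and node1=1024 as expected.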
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:55.044 07:11:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:59.241 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:59.241 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@17 -- # local get=AnonHugePages 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43465580 kB' 'MemAvailable: 47455788 kB' 'Buffers: 6064 kB' 'Cached: 10569464 kB' 'SwapCached: 0 kB' 'Active: 7412256 kB' 'Inactive: 3689560 kB' 'Active(anon): 7013832 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529236 kB' 'Mapped: 200324 kB' 'Shmem: 6487544 kB' 'KReclaimable: 549932 kB' 'Slab: 1200528 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 650596 kB' 'KernelStack: 22032 kB' 'PageTables: 8308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8481468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218812 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.241 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.241 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 
07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.242 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43465720 kB' 'MemAvailable: 47455928 kB' 'Buffers: 6064 kB' 'Cached: 10569480 kB' 'SwapCached: 0 kB' 'Active: 7412492 kB' 'Inactive: 3689560 kB' 'Active(anon): 7014068 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 
kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529964 kB' 'Mapped: 200236 kB' 'Shmem: 6487560 kB' 'KReclaimable: 549932 kB' 'Slab: 1200544 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 650612 kB' 'KernelStack: 22080 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8481856 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218796 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.243 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.506 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.507 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43467928 kB' 'MemAvailable: 47458136 kB' 'Buffers: 6064 kB' 'Cached: 10569496 kB' 'SwapCached: 0 kB' 'Active: 7412212 kB' 'Inactive: 3689560 kB' 'Active(anon): 7013788 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529624 kB' 'Mapped: 200236 kB' 'Shmem: 6487576 kB' 'KReclaimable: 549932 kB' 'Slab: 1200544 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 650612 kB' 
'KernelStack: 22080 kB' 'PageTables: 8520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8481876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218796 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.508 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
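[editor note] The stretch of trace above and below this point is one pattern repeated once per /proc/meminfo field: the get_meminfo helper in setup/common.sh splits each line on ': ', skips (continue) every key that is not the one requested, and echoes the matching value. The following is a minimal sketch reconstructed from the traced commands (printf at @16, IFS/read at @31, the [[ ... ]]/continue pairs at @32, echo/return at @33); it is an approximation of the helper's shape, not the verbatim SPDK source.

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern used to strip "Node N " prefixes

    # Approximate shape of get_meminfo as it appears in this trace.
    get_meminfo() {
        local get=$1            # e.g. HugePages_Rsvd
        local node=${2:-}       # optional NUMA node index
        local var val _
        local mem_f=/proc/meminfo mem
        # With a node index, read the per-node meminfo file instead of the global one.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node N " prefix
        # Scan field by field; every non-matching key is the "continue" seen in the trace.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
    }

    # Usage matching this trace: get_meminfo HugePages_Rsvd  -> prints 0 on this machine.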
00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:59.509 nr_hugepages=1024 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:59.509 resv_hugepages=0 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:59.509 surplus_hugepages=0 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:59.509 anon_hugepages=0 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.509 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43467928 kB' 'MemAvailable: 47458136 kB' 'Buffers: 6064 kB' 'Cached: 10569540 kB' 'SwapCached: 0 kB' 'Active: 7411860 kB' 'Inactive: 3689560 kB' 'Active(anon): 7013436 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529212 kB' 'Mapped: 200236 kB' 'Shmem: 6487620 kB' 'KReclaimable: 549932 
kB' 'Slab: 1200544 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 650612 kB' 'KernelStack: 22064 kB' 'PageTables: 8464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8481900 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218796 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.510 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
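[editor note] By this point the no_shrink_alloc test has resolved surp=0 (HugePages_Surp) and resv=0 (HugePages_Rsvd), and the scan running here resolves HugePages_Total to 1024, so the check traced as hugepages.sh@110 reduces to 1024 == nr_hugepages + surp + resv. The trace then walks the NUMA nodes (no_nodes=2, 1024 pages recorded on node0 and 0 on node1) and repeats the same scan against /sys/devices/system/node/node0/meminfo. A condensed sketch of that accounting follows, using the get_meminfo shape from the note above and the values from this run; the variable name "total" and the exact hugepages.sh internals are approximations.

    # Values as they appear in this run of the trace.
    nr_hugepages=1024
    surp=$(get_meminfo HugePages_Surp)     # -> 0
    resv=$(get_meminfo HugePages_Rsvd)     # -> 0
    total=$(get_meminfo HugePages_Total)   # -> 1024

    # Requested pages plus surplus and reserved pages must account for every allocated hugepage.
    if (( total != nr_hugepages + surp + resv )); then
        echo "hugepage accounting mismatch: total=$total nr=$nr_hugepages surp=$surp resv=$resv"
    fi

    # The per-node pass that follows (hugepages.sh@112..117 in the trace) repeats the scan
    # against each node's meminfo file, e.g. for node 0:
    get_meminfo HugePages_Surp 0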
00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.511 07:11:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:59.511 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 25821428 kB' 'MemUsed: 6817712 kB' 'SwapCached: 0 kB' 'Active: 2870020 kB' 'Inactive: 231284 kB' 'Active(anon): 2736972 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2728600 kB' 'Mapped: 98204 kB' 'AnonPages: 376012 kB' 'Shmem: 2364268 kB' 'KernelStack: 12088 kB' 'PageTables: 5484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 220220 kB' 'Slab: 522168 kB' 'SReclaimable: 220220 kB' 'SUnreclaim: 301948 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 
kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:59.512 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # read -r var val _; [[ $var == HugePages_Surp ]] -- # continue for: MemTotal MemFree MemUsed SwapCached Active Inactive Active(anon) Inactive(anon) Active(file) Inactive(file) Unevictable Mlocked Dirty Writeback FilePages Mapped AnonPages Shmem KernelStack PageTables SecPageTables NFS_Unstable Bounce WritebackTmp KReclaimable Slab SReclaimable SUnreclaim AnonHugePages ShmemHugePages ShmemPmdMapped FileHugePages FilePmdMapped 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.513 07:11:31
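Before this per-node HugePages_Surp scan, the get_nodes trace (setup/hugepages.sh@27-33) walked /sys/devices/system/node/node+([0-9]) and recorded 1024 pages on node0 and 0 on node1 (no_nodes=2). A small sketch of that per-node inventory; the nodes_sys name follows the trace, while reading nr_hugepages out of each node's hugepages sysfs directory is an assumption about where those counts come from, since the xtrace only shows the resulting assignments:

shopt -s extglob

# Collect the current 2048 kB hugepage count for every NUMA node.
declare -a nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    id=${node##*node}                                   # /sys/.../node0 -> 0
    nodes_sys[id]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
done

echo "no_nodes=${#nodes_sys[@]}"                        # 2 on this test host
for id in "${!nodes_sys[@]}"; do
    echo "node$id: ${nodes_sys[id]} hugepages"          # e.g. node0: 1024, node1: 0
done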
setup/common.sh@32 -- # continue 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:59.513 node0=1024 expecting 1024 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:59.513 07:11:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:03.713 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:80:04.4 (8086 2021): Already using 
the vfio-pci driver 00:06:03.713 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:06:03.713 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:06:03.713 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43510136 kB' 'MemAvailable: 47500344 kB' 'Buffers: 6064 kB' 'Cached: 10569620 kB' 'SwapCached: 0 kB' 'Active: 7413700 kB' 'Inactive: 3689560 kB' 'Active(anon): 7015276 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530700 kB' 'Mapped: 200272 kB' 'Shmem: 6487700 kB' 'KReclaimable: 549932 kB' 'Slab: 1200896 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 650964 kB' 'KernelStack: 22272 kB' 'PageTables: 9032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8485368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218956 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 
'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:06:03.713 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # read -r var val _; [[ $var == AnonHugePages ]] -- # continue for: MemTotal MemFree MemAvailable Buffers Cached SwapCached Active Inactive Active(anon) Inactive(anon) Active(file) Inactive(file) Unevictable Mlocked SwapTotal SwapFree Zswap Zswapped Dirty Writeback AnonPages Mapped Shmem KReclaimable Slab SReclaimable SUnreclaim KernelStack PageTables SecPageTables NFS_Unstable Bounce WritebackTmp CommitLimit Committed_AS VmallocTotal VmallocUsed VmallocChunk Percpu HardwareCorrupted 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local
get=HugePages_Surp 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43510096 kB' 'MemAvailable: 47500304 kB' 'Buffers: 6064 kB' 'Cached: 10569624 kB' 'SwapCached: 0 kB' 'Active: 7413312 kB' 'Inactive: 3689560 kB' 'Active(anon): 7014888 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530388 kB' 'Mapped: 200260 kB' 'Shmem: 6487704 kB' 'KReclaimable: 549932 kB' 'Slab: 1200948 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 651016 kB' 'KernelStack: 22176 kB' 'PageTables: 8940 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8488000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218956 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB' 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:03.715 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # read -r var val _; [[ $var == HugePages_Surp ]] -- # continue for: Buffers Cached SwapCached Active Inactive Active(anon) Inactive(anon) Active(file) Inactive(file) Unevictable Mlocked SwapTotal SwapFree Zswap Zswapped Dirty Writeback AnonPages Mapped Shmem KReclaimable Slab SReclaimable SUnreclaim KernelStack PageTables SecPageTables NFS_Unstable Bounce WritebackTmp CommitLimit Committed_AS VmallocTotal VmallocUsed VmallocChunk Percpu HardwareCorrupted AnonHugePages ShmemHugePages ShmemPmdMapped FileHugePages FilePmdMapped 00:06:03.717 07:11:36
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 
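
The echo 0 / return 0 pair above is how get_meminfo reports the HugePages_Surp value back to hugepages.sh: each meminfo line is split on ': ' into a key and a value, non-matching keys fall through to continue, and the value of the first matching key is echoed. Below is a minimal stand-alone sketch of that scan, reconstructed from the xtrace output above; the function name get_meminfo_sketch and the exact argument handling are assumptions for illustration, not the verbatim setup/common.sh source.

```bash
#!/usr/bin/env bash
# Sketch of the per-key scan traced above (assumed helper, not the SPDK source).
shopt -s extglob   # needed for the +([0-9]) pattern used to strip node prefixes

get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # With a node index, read that node's own meminfo file instead.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo

    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; drop the prefix so both
    # sources share the plain "Key: value" layout.
    mem=("${mem[@]#Node +([0-9]) }")

    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # the repeated 'continue' lines above
        echo "$val"                        # e.g. 0 for HugePages_Surp
        return 0
    done
    return 1
}
```

On this box, get_meminfo_sketch HugePages_Surp would print 0, matching the surp=0 recorded at setup/hugepages.sh@99; a second argument such as get_meminfo_sketch HugePages_Surp 0 switches the source to /sys/devices/system/node/node0/meminfo, which is what the per-node pass later in this log does.
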
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43510180 kB' 'MemAvailable: 47500388 kB' 'Buffers: 6064 kB' 'Cached: 10569640 kB' 'SwapCached: 0 kB' 'Active: 7413488 kB' 'Inactive: 3689560 kB' 'Active(anon): 7015064 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530516 kB' 'Mapped: 200256 kB' 'Shmem: 6487720 kB' 'KReclaimable: 549932 kB' 'Slab: 1201012 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 651080 kB' 'KernelStack: 22240 kB' 'PageTables: 8556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8485160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218940 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB'
00:06:03.717 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # per-key scan: every key from MemTotal through HugePages_Free is skipped (no match for HugePages_Rsvd)
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:06:03.719 nr_hugepages=1024
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:06:03.719 resv_hugepages=0
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:06:03.719 surplus_hugepages=0
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:06:03.719 anon_hugepages=0
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
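
With surp and resv both 0, the script prints the pool summary above and then cross-checks the kernel's view of the hugepage pool. Below is a minimal sketch of that bookkeeping, reusing the hypothetical get_meminfo_sketch helper from the earlier sketch (it must be defined in the same shell); the exact expressions in setup/hugepages.sh may differ, this only mirrors the checks visible in the trace.

```bash
#!/usr/bin/env bash
# Bookkeeping sketch: the requested pool must fully account for what the
# kernel reports. nr_hugepages=1024 is the value echoed in the log above;
# get_meminfo_sketch is the assumed parser from the previous sketch.
nr_hugepages=1024
surp=$(get_meminfo_sketch HugePages_Surp)     # 0 in this run
resv=$(get_meminfo_sketch HugePages_Rsvd)     # 0 in this run
anon=$(get_meminfo_sketch AnonHugePages)      # 0 (kB suffix dropped by the parser)

echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"
echo "anon_hugepages=$anon"

total=$(get_meminfo_sketch HugePages_Total)
# Mirrors the two arithmetic checks in the trace: the total page count must
# equal the request plus any surplus/reserved pages, and (with both at 0)
# the request itself.
(( total == nr_hugepages + surp + resv )) || exit 1
(( total == nr_hugepages )) || exit 1
```
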
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 43510300 kB' 'MemAvailable: 47500508 kB' 'Buffers: 6064 kB' 'Cached: 10569664 kB' 'SwapCached: 0 kB' 'Active: 7413872 kB' 'Inactive: 3689560 kB' 'Active(anon): 7015448 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530856 kB' 'Mapped: 200256 kB' 'Shmem: 6487744 kB' 'KReclaimable: 549932 kB' 'Slab: 1200948 kB' 'SReclaimable: 549932 kB' 'SUnreclaim: 651016 kB' 'KernelStack: 22192 kB' 'PageTables: 8728 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 8485432 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218972 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3448180 kB' 'DirectMap2M: 19306496 kB' 'DirectMap1G: 46137344 kB'
00:06:03.719 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # per-key scan: every key from MemTotal through Unaccepted is skipped (no match for HugePages_Total)
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
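
The get_nodes pass above found two NUMA nodes (1024 hugepages recorded for node0, 0 for node1), and get_meminfo is now being re-run with an explicit node index so the per-node counters come from that node's own meminfo file. Below is a sketch of the enumeration and the per-node query, again with hypothetical names and reusing get_meminfo_sketch from the earlier sketch; the trace only shows the resulting values (nodes_sys[0]=1024, nodes_sys[1]=0), so the sysfs counter the per-node count is read from here is an assumption.

```bash
#!/usr/bin/env bash
shopt -s extglob   # for the node+([0-9]) glob seen in the trace

# Enumerate NUMA nodes the way the get_nodes trace above does and record a
# per-node hugepage count. Reading it from the 2048 kB sysfs counter is an
# assumption; the trace only shows the final 1024/0 values.
declare -a nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
no_nodes=${#nodes_sys[@]}
(( no_nodes > 0 )) || exit 1   # the test needs at least one node

# Per-node query in the spirit of setup/hugepages.sh@115-117: read each
# node's surplus/free pages through its own meminfo file via the assumed
# parser sketched earlier (must be defined in the same shell).
for node in "${!nodes_sys[@]}"; do
    surp=$(get_meminfo_sketch HugePages_Surp "$node")
    free=$(get_meminfo_sketch HugePages_Free "$node")
    echo "node$node: free=$free surp=$surp allocated=${nodes_sys[node]}"
done
```
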
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 25835456 kB' 'MemUsed: 6803684 kB' 'SwapCached: 0 kB' 'Active: 2870760 kB' 'Inactive: 231284 kB' 'Active(anon): 2737712 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2728624 kB' 'Mapped: 98212 kB' 'AnonPages: 377000 kB' 'Shmem: 2364292 kB' 'KernelStack: 12360 kB' 'PageTables: 6072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 220220 kB' 'Slab: 522384 kB' 'SReclaimable: 220220 kB' 'SUnreclaim: 302164 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:06:03.721 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # per-key scan of node0 meminfo: MemTotal, MemFree, MemUsed, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable and Mlocked are skipped (no match for HugePages_Surp yet)
00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 --
# [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
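This pattern repeats for every remaining field until HugePages_Surp matches, after which the test folds the surplus count into its per-node tallies and prints the node0=1024 expecting 1024 verdict a little further on in the trace. A deliberately simplified, self-contained version of that per-node check (nodes_expected, the 2 MiB page size and the error message are assumptions for illustration; the real hugepages.sh also adds reserved and surplus pages from get_meminfo before comparing):

  # Sketch: verify each NUMA node holds the hugepage count the test expects.
  declare -A nodes_expected=([0]=1024 [1]=0)
  ok=1
  for node_dir in /sys/devices/system/node/node[0-9]*; do
      n=${node_dir##*node}
      got=$(cat "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
      echo "node$n=$got expecting ${nodes_expected[$n]:-0}"
      [[ $got -eq ${nodes_expected[$n]:-0} ]] || ok=0
  done
  (( ok )) || echo "hugepage distribution mismatch" >&2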
00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.722 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:06:03.723 node0=1024 expecting 1024 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:06:03.723 00:06:03.723 real 0m8.731s 00:06:03.723 user 0m3.219s 00:06:03.723 sys 0m5.664s 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:03.723 07:11:36 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:06:03.723 ************************************ 00:06:03.723 END TEST no_shrink_alloc 00:06:03.723 ************************************ 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:06:03.983 07:11:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:06:03.983 00:06:03.983 real 0m33.288s 00:06:03.983 user 0m11.765s 00:06:03.983 sys 0m20.169s 00:06:03.983 07:11:36 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:03.983 07:11:36 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:06:03.983 ************************************ 00:06:03.983 END TEST hugepages 00:06:03.983 ************************************ 00:06:03.983 07:11:36 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:06:03.983 07:11:36 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:03.983 07:11:36 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:03.983 07:11:36 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:03.983 ************************************ 00:06:03.983 START TEST driver 00:06:03.983 ************************************ 00:06:03.983 07:11:36 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:06:03.983 * Looking for test storage... 
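Between the hugepages suite and the driver suite, clear_hp drains every hugepage pool by writing zero into each node's per-size nr_hugepages knob and exports CLEAR_HUGE=yes so the next setup.sh run knows the pools start empty. A minimal sketch of that cleanup (the redirect target is implied by the sysfs layout, since xtrace only records the bare echo 0 calls; this needs root):

  # Sketch: release all hugepage reservations on every NUMA node.
  for node_dir in /sys/devices/system/node/node[0-9]*; do
      for hp in "$node_dir"/hugepages/hugepages-*; do
          echo 0 > "$hp/nr_hugepages"
      done
  done
  export CLEAR_HUGE=yes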
00:06:03.983 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:03.983 07:11:36 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:06:03.983 07:11:36 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:03.983 07:11:36 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:10.555 07:11:42 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:06:10.555 07:11:42 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:10.555 07:11:42 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:10.555 07:11:42 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:06:10.555 ************************************ 00:06:10.555 START TEST guess_driver 00:06:10.555 ************************************ 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 256 > 0 )) 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:06:10.555 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:06:10.555 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:06:10.555 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:06:10.555 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:06:10.555 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:06:10.555 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:06:10.555 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:06:10.555 Looking for driver=vfio-pci 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:06:10.555 07:11:42 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:13.844 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.105 07:11:46 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.105 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:14.106 07:11:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:16.009 07:11:48 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:16.009 07:11:48 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:16.009 07:11:48 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:16.009 07:11:48 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:06:16.009 07:11:48 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:06:16.009 07:11:48 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:16.009 07:11:48 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:22.572 00:06:22.572 real 0m11.913s 00:06:22.572 user 0m3.154s 00:06:22.572 sys 0m6.108s 00:06:22.572 07:11:54 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:22.572 07:11:54 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:06:22.572 ************************************ 00:06:22.572 END TEST guess_driver 00:06:22.572 ************************************ 00:06:22.572 00:06:22.572 real 0m17.780s 00:06:22.572 user 0m4.853s 00:06:22.572 sys 0m9.434s 00:06:22.572 07:11:54 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:22.572 
07:11:54 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:06:22.572 ************************************ 00:06:22.572 END TEST driver 00:06:22.572 ************************************ 00:06:22.572 07:11:54 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:22.572 07:11:54 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:22.572 07:11:54 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:22.572 07:11:54 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:22.572 ************************************ 00:06:22.572 START TEST devices 00:06:22.572 ************************************ 00:06:22.572 07:11:54 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:22.572 * Looking for test storage... 00:06:22.572 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:22.572 07:11:54 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:06:22.572 07:11:54 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:06:22.572 07:11:54 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:22.572 07:11:54 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:06:26.764 07:11:58 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:06:26.764 No valid GPT data, bailing 
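devices.sh settles on its test disk by walking /sys/block/nvme*, skipping zoned namespaces, skipping anything that already carries a partition table or filesystem signature (the spdk-gpt.py / blkid probe above reports this disk as unused), and keeping only disks of at least 3 GiB; the single ~2 TB namespace at 0000:d8:00.0 passes and becomes nvme0n1. A condensed sketch of that selection, with blkid standing in for the spdk-gpt.py wrapper and a plain glob in place of the !(*c*) multipath filter seen in the trace:

  # Sketch: collect idle, non-zoned NVMe namespaces big enough for the mount tests.
  min_disk_size=$((3 * 1024 * 1024 * 1024))    # 3 GiB, same floor as devices.sh
  blocks=()
  for dev in /sys/block/nvme*n*; do
      name=${dev##*/}
      [[ $(cat "$dev/queue/zoned" 2>/dev/null) == none ]] || continue   # skip zoned namespaces
      blkid -s PTTYPE -o value "/dev/$name" >/dev/null 2>&1 && continue # partition table => in use
      size_bytes=$(( $(cat "$dev/size") * 512 ))                        # size file counts 512-byte sectors
      (( size_bytes >= min_disk_size )) && blocks+=("$name")
  done
  echo "candidate test disks: ${blocks[*]}"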
00:06:26.764 07:11:58 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:06:26.764 07:11:58 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:06:26.764 07:11:58 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:06:26.764 07:11:58 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.764 07:11:58 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:26.764 ************************************ 00:06:26.764 START TEST nvme_mount 00:06:26.764 ************************************ 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- 
setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:26.764 07:11:59 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:06:27.699 Creating new GPT entries in memory. 00:06:27.699 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:27.699 other utilities. 00:06:27.699 07:12:00 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:27.699 07:12:00 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:27.699 07:12:00 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:27.699 07:12:00 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:27.699 07:12:00 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:28.631 Creating new GPT entries in memory. 00:06:28.631 The operation has completed successfully. 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1512157 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:06:28.631 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:28.887 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:28.887 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:28.888 07:12:01 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:33.073 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:33.073 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:33.073 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:06:33.073 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:33.073 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/common.sh@68 
-- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:33.073 07:12:05 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:33.332 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:33.332 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:33.332 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:33.332 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:33.332 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:33.332 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:33.333 07:12:05 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:37.528 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:37.529 07:12:09 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:41.724 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:41.724 00:06:41.724 real 0m14.816s 00:06:41.724 user 0m4.479s 00:06:41.724 sys 0m8.297s 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:41.724 07:12:13 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:41.724 ************************************ 00:06:41.724 END TEST nvme_mount 00:06:41.724 ************************************ 00:06:41.724 07:12:13 setup.sh.devices -- 
setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:41.724 07:12:13 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:41.724 07:12:13 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:41.724 07:12:13 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:41.724 ************************************ 00:06:41.724 START TEST dm_mount 00:06:41.724 ************************************ 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:41.724 07:12:13 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:42.660 Creating new GPT entries in memory. 00:06:42.660 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:42.660 other utilities. 00:06:42.660 07:12:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:42.660 07:12:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:42.660 07:12:14 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:42.660 07:12:14 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:42.660 07:12:14 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:43.596 Creating new GPT entries in memory. 00:06:43.596 The operation has completed successfully. 
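The sgdisk calls traced above carve two 1 GiB partitions out of /dev/nvme0n1; the sector bounds come from the size/part_start/part_end arithmetic in setup/common.sh. A minimal stand-alone sketch of that arithmetic, assuming 512-byte sectors and printing the sgdisk commands rather than running them:

  size=1073741824                 # 1 GiB per partition, as in the trace
  size=$((size / 512))            # 2097152 sectors
  part_start=0 part_end=0
  for part in 1 2; do
    part_start=$(( part_start == 0 ? 2048 : part_end + 1 ))
    part_end=$(( part_start + size - 1 ))
    echo "sgdisk /dev/nvme0n1 --new=${part}:${part_start}:${part_end}"
  done

This reproduces the two ranges seen in the trace, 1:2048:2099199 and 2:2099200:4196351.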
00:06:43.596 07:12:15 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:43.596 07:12:15 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:43.596 07:12:15 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:43.596 07:12:15 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:43.596 07:12:15 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:44.535 The operation has completed successfully. 00:06:44.535 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:44.535 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:44.535 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1517990 00:06:44.535 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:44.535 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:44.535 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:44.535 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-2 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-2 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-2 ]] 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-2 ]] 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:44.794 
07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:44.794 07:12:17 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:20 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2 '' '' 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:48.986 07:12:21 
setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:48.986 07:12:21 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.270 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\2\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\2* ]] 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:52.529 07:12:24 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:52.529 07:12:25 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:52.529 07:12:25 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:52.529 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:52.529 07:12:25 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:52.529 07:12:25 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:52.529 00:06:52.529 real 0m11.115s 00:06:52.529 user 0m2.738s 00:06:52.529 sys 0m5.359s 00:06:52.529 07:12:25 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:52.529 07:12:25 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:52.529 ************************************ 00:06:52.529 END TEST dm_mount 00:06:52.529 ************************************ 00:06:52.788 07:12:25 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:52.788 07:12:25 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:52.788 07:12:25 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:52.788 07:12:25 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:52.788 07:12:25 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:52.788 07:12:25 
setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:52.788 07:12:25 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:53.047 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:53.047 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:06:53.047 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:53.048 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:53.048 07:12:25 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:06:53.048 07:12:25 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:53.048 07:12:25 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:53.048 07:12:25 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:53.048 07:12:25 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:53.048 07:12:25 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:53.048 07:12:25 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:53.048 00:06:53.048 real 0m31.131s 00:06:53.048 user 0m9.025s 00:06:53.048 sys 0m16.994s 00:06:53.048 07:12:25 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:53.048 07:12:25 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:53.048 ************************************ 00:06:53.048 END TEST devices 00:06:53.048 ************************************ 00:06:53.048 00:06:53.048 real 1m52.107s 00:06:53.048 user 0m35.101s 00:06:53.048 sys 1m4.875s 00:06:53.048 07:12:25 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:53.048 07:12:25 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:53.048 ************************************ 00:06:53.048 END TEST setup.sh 00:06:53.048 ************************************ 00:06:53.048 07:12:25 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:06:57.237 Hugepages 00:06:57.237 node hugesize free / total 00:06:57.237 node0 1048576kB 0 / 0 00:06:57.237 node0 2048kB 1024 / 1024 00:06:57.237 node1 1048576kB 0 / 0 00:06:57.237 node1 2048kB 1024 / 1024 00:06:57.237 00:06:57.237 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:57.237 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:57.237 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:57.237 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:57.237 07:12:29 -- spdk/autotest.sh@130 -- # uname -s 00:06:57.237 07:12:29 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:06:57.237 07:12:29 -- spdk/autotest.sh@132 -- # 
nvme_namespace_revert 00:06:57.237 07:12:29 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:01.424 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:07:01.424 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:03.325 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:07:03.583 07:12:35 -- common/autotest_common.sh@1532 -- # sleep 1 00:07:04.522 07:12:36 -- common/autotest_common.sh@1533 -- # bdfs=() 00:07:04.522 07:12:36 -- common/autotest_common.sh@1533 -- # local bdfs 00:07:04.522 07:12:36 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:07:04.522 07:12:36 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:07:04.522 07:12:36 -- common/autotest_common.sh@1513 -- # bdfs=() 00:07:04.522 07:12:36 -- common/autotest_common.sh@1513 -- # local bdfs 00:07:04.522 07:12:36 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:04.522 07:12:36 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:04.522 07:12:36 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:07:04.781 07:12:37 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:07:04.781 07:12:37 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:d8:00.0 00:07:04.781 07:12:37 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:07:08.974 Waiting for block devices as requested 00:07:08.974 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:07:08.974 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:07:08.974 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:07:08.974 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:07:09.234 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:07:09.234 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:07:09.234 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:07:09.493 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:07:09.493 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:07:09.493 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:07:09.753 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:07:09.753 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:07:09.753 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:07:10.012 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:07:10.012 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:07:10.012 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:07:10.272 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:07:10.272 07:12:42 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:07:10.272 07:12:42 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 
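The trace above collects the NVMe BDF list via gen_nvme.sh piped through jq, then resolves each address back to its kernel controller name. An illustrative sysfs-only way to list the same controller-to-BDF mapping (a sketch, not the script the test actually runs):

  for ctrl in /sys/class/nvme/nvme*; do
    [ -e "$ctrl" ] || continue                       # no NVMe controllers present
    bdf=$(basename "$(readlink -f "$ctrl/device")")  # e.g. 0000:d8:00.0
    printf '%s -> %s\n' "$(basename "$ctrl")" "$bdf"
  done

On this machine it would print a single line, nvme0 -> 0000:d8:00.0, matching the traddr reported above.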
00:07:10.272 07:12:42 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:07:10.272 07:12:42 -- common/autotest_common.sh@1502 -- # grep 0000:d8:00.0/nvme/nvme 00:07:10.272 07:12:42 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:07:10.272 07:12:42 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:07:10.272 07:12:42 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:07:10.272 07:12:42 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:07:10.272 07:12:42 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:07:10.272 07:12:42 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:07:10.272 07:12:42 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:07:10.272 07:12:42 -- common/autotest_common.sh@1545 -- # grep oacs 00:07:10.272 07:12:42 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:07:10.272 07:12:42 -- common/autotest_common.sh@1545 -- # oacs=' 0xe' 00:07:10.272 07:12:42 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:07:10.272 07:12:42 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:07:10.272 07:12:42 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:07:10.272 07:12:42 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:07:10.272 07:12:42 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:07:10.273 07:12:42 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:07:10.273 07:12:42 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:07:10.273 07:12:42 -- common/autotest_common.sh@1557 -- # continue 00:07:10.273 07:12:42 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:07:10.273 07:12:42 -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:10.273 07:12:42 -- common/autotest_common.sh@10 -- # set +x 00:07:10.273 07:12:42 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:07:10.273 07:12:42 -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:10.273 07:12:42 -- common/autotest_common.sh@10 -- # set +x 00:07:10.273 07:12:42 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:14.469 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:14.469 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:14.469 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:14.469 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:07:14.470 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:16.379 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:07:16.379 07:12:48 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:07:16.379 07:12:48 -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:16.379 07:12:48 -- common/autotest_common.sh@10 -- # set +x 00:07:16.639 07:12:48 -- spdk/autotest.sh@144 -- # 
opal_revert_cleanup 00:07:16.639 07:12:48 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:07:16.639 07:12:48 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:07:16.639 07:12:48 -- common/autotest_common.sh@1577 -- # bdfs=() 00:07:16.639 07:12:48 -- common/autotest_common.sh@1577 -- # local bdfs 00:07:16.639 07:12:48 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:07:16.639 07:12:48 -- common/autotest_common.sh@1513 -- # bdfs=() 00:07:16.639 07:12:48 -- common/autotest_common.sh@1513 -- # local bdfs 00:07:16.639 07:12:48 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:16.639 07:12:48 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:16.639 07:12:48 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:07:16.639 07:12:49 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:07:16.639 07:12:49 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:d8:00.0 00:07:16.639 07:12:49 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:07:16.639 07:12:49 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:07:16.639 07:12:49 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:07:16.639 07:12:49 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:07:16.639 07:12:49 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:07:16.639 07:12:49 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:d8:00.0 00:07:16.639 07:12:49 -- common/autotest_common.sh@1592 -- # [[ -z 0000:d8:00.0 ]] 00:07:16.639 07:12:49 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=1529397 00:07:16.639 07:12:49 -- common/autotest_common.sh@1598 -- # waitforlisten 1529397 00:07:16.639 07:12:49 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:16.639 07:12:49 -- common/autotest_common.sh@831 -- # '[' -z 1529397 ']' 00:07:16.639 07:12:49 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:16.639 07:12:49 -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:16.639 07:12:49 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:16.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:16.639 07:12:49 -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:16.639 07:12:49 -- common/autotest_common.sh@10 -- # set +x 00:07:16.639 [2024-07-25 07:12:49.157786] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
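The get_nvme_bdfs_by_id 0x0a54 step traced above keeps only controllers whose PCI device id matches 0x0a54 by reading sysfs, exactly as the cat /sys/bus/pci/devices/0000:d8:00.0/device line shows. A minimal sketch of that filter (the BDF list here is hard-coded for illustration; the helper derives it from gen_nvme.sh):

  want=0x0a54
  for bdf in 0000:d8:00.0; do
    dev=$(cat "/sys/bus/pci/devices/$bdf/device")
    [ "$dev" = "$want" ] && echo "$bdf has device id $want"
  done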
00:07:16.639 [2024-07-25 07:12:49.157849] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1529397 ] 00:07:16.899 [2024-07-25 07:12:49.277326] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.899 [2024-07-25 07:12:49.368739] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.835 07:12:50 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:17.836 07:12:50 -- common/autotest_common.sh@864 -- # return 0 00:07:17.836 07:12:50 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:07:17.836 07:12:50 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:07:17.836 07:12:50 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:07:21.118 nvme0n1 00:07:21.118 07:12:53 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:07:21.118 [2024-07-25 07:12:53.341292] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:07:21.118 request: 00:07:21.118 { 00:07:21.118 "nvme_ctrlr_name": "nvme0", 00:07:21.118 "password": "test", 00:07:21.118 "method": "bdev_nvme_opal_revert", 00:07:21.118 "req_id": 1 00:07:21.118 } 00:07:21.118 Got JSON-RPC error response 00:07:21.118 response: 00:07:21.118 { 00:07:21.118 "code": -32602, 00:07:21.118 "message": "Invalid parameters" 00:07:21.118 } 00:07:21.118 07:12:53 -- common/autotest_common.sh@1604 -- # true 00:07:21.118 07:12:53 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:07:21.118 07:12:53 -- common/autotest_common.sh@1608 -- # killprocess 1529397 00:07:21.118 07:12:53 -- common/autotest_common.sh@950 -- # '[' -z 1529397 ']' 00:07:21.118 07:12:53 -- common/autotest_common.sh@954 -- # kill -0 1529397 00:07:21.118 07:12:53 -- common/autotest_common.sh@955 -- # uname 00:07:21.118 07:12:53 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:21.118 07:12:53 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1529397 00:07:21.118 07:12:53 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:21.118 07:12:53 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:21.118 07:12:53 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1529397' 00:07:21.118 killing process with pid 1529397 00:07:21.118 07:12:53 -- common/autotest_common.sh@969 -- # kill 1529397 00:07:21.118 07:12:53 -- common/autotest_common.sh@974 -- # wait 1529397 00:07:23.646 07:12:56 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:07:23.646 07:12:56 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:07:23.646 07:12:56 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:07:23.646 07:12:56 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:07:23.646 07:12:56 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:07:24.583 Restarting all devices. 
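After the opal revert attempt fails, the trace tears the target down through killprocess: confirm the pid is alive, refuse to signal a sudo wrapper, then kill and reap it. A condensed, illustrative version of that pattern (not the exact autotest_common.sh code):

  killproc() {
    local pid=$1
    kill -0 "$pid" || return 1                 # still running?
    local name
    name=$(ps --no-headers -o comm= "$pid")    # reactor_0 for an SPDK target
    [ "$name" = sudo ] && return 1             # never signal the sudo parent
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" 2>/dev/null || true            # reap it if it is our child
  }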
00:07:31.152 lstat() error: No such file or directory 00:07:31.152 QAT Error: No GENERAL section found 00:07:31.152 Failed to configure qat_dev0 00:07:31.152 lstat() error: No such file or directory 00:07:31.152 QAT Error: No GENERAL section found 00:07:31.152 Failed to configure qat_dev1 00:07:31.152 lstat() error: No such file or directory 00:07:31.152 QAT Error: No GENERAL section found 00:07:31.152 Failed to configure qat_dev2 00:07:31.152 lstat() error: No such file or directory 00:07:31.152 QAT Error: No GENERAL section found 00:07:31.152 Failed to configure qat_dev3 00:07:31.152 lstat() error: No such file or directory 00:07:31.152 QAT Error: No GENERAL section found 00:07:31.152 Failed to configure qat_dev4 00:07:31.152 enable sriov 00:07:31.152 Checking status of all devices. 00:07:31.152 There is 5 QAT acceleration device(s) in the system: 00:07:31.152 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:1a:00.0, #accel: 5 #engines: 10 state: down 00:07:31.152 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:1c:00.0, #accel: 5 #engines: 10 state: down 00:07:31.152 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:1e:00.0, #accel: 5 #engines: 10 state: down 00:07:31.152 qat_dev3 - type: c6xx, inst_id: 3, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:07:31.152 qat_dev4 - type: c6xx, inst_id: 4, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:07:31.152 0000:1a:00.0 set to 16 VFs 00:07:31.718 0000:1c:00.0 set to 16 VFs 00:07:32.654 0000:1e:00.0 set to 16 VFs 00:07:33.222 0000:3d:00.0 set to 16 VFs 00:07:34.160 0000:3f:00.0 set to 16 VFs 00:07:36.727 Properly configured the qat device with driver uio_pci_generic. 00:07:36.727 07:13:08 -- spdk/autotest.sh@162 -- # timing_enter lib 00:07:36.727 07:13:08 -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:36.727 07:13:08 -- common/autotest_common.sh@10 -- # set +x 00:07:36.727 07:13:08 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:07:36.727 07:13:08 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:36.727 07:13:08 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:36.727 07:13:08 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:36.727 07:13:08 -- common/autotest_common.sh@10 -- # set +x 00:07:36.727 ************************************ 00:07:36.727 START TEST env 00:07:36.727 ************************************ 00:07:36.727 07:13:08 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:36.727 * Looking for test storage... 
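The "set to 16 VFs" lines above come from qat_setup.sh enabling SR-IOV on each QAT physical function listed by the status check. The exact mechanism is not shown in the trace; the usual sysfs route, sketched here only as an illustration (PF addresses taken from the device list above), is to write the VF count to sriov_numvfs on each PF:

  for pf in 0000:1a:00.0 0000:1c:00.0 0000:1e:00.0 0000:3d:00.0 0000:3f:00.0; do
    echo 0  > "/sys/bus/pci/devices/$pf/sriov_numvfs"   # clear any existing VFs first
    echo 16 > "/sys/bus/pci/devices/$pf/sriov_numvfs"   # then create 16 VFs per PF
  done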
00:07:36.727 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:07:36.727 07:13:08 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:36.727 07:13:08 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:36.727 07:13:08 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:36.727 07:13:08 env -- common/autotest_common.sh@10 -- # set +x 00:07:36.727 ************************************ 00:07:36.727 START TEST env_memory 00:07:36.727 ************************************ 00:07:36.727 07:13:08 env.env_memory -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:36.727 00:07:36.727 00:07:36.727 CUnit - A unit testing framework for C - Version 2.1-3 00:07:36.727 http://cunit.sourceforge.net/ 00:07:36.727 00:07:36.727 00:07:36.727 Suite: memory 00:07:36.727 Test: alloc and free memory map ...[2024-07-25 07:13:09.011316] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:07:36.727 passed 00:07:36.727 Test: mem map translation ...[2024-07-25 07:13:09.038221] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:07:36.727 [2024-07-25 07:13:09.038242] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:07:36.727 [2024-07-25 07:13:09.038295] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:07:36.727 [2024-07-25 07:13:09.038306] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:07:36.727 passed 00:07:36.727 Test: mem map registration ...[2024-07-25 07:13:09.091348] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:07:36.727 [2024-07-25 07:13:09.091369] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:07:36.727 passed 00:07:36.727 Test: mem map adjacent registrations ...passed 00:07:36.727 00:07:36.727 Run Summary: Type Total Ran Passed Failed Inactive 00:07:36.727 suites 1 1 n/a 0 0 00:07:36.727 tests 4 4 4 0 0 00:07:36.727 asserts 152 152 152 0 n/a 00:07:36.727 00:07:36.727 Elapsed time = 0.184 seconds 00:07:36.727 00:07:36.727 real 0m0.198s 00:07:36.727 user 0m0.185s 00:07:36.727 sys 0m0.012s 00:07:36.727 07:13:09 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:36.727 07:13:09 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:07:36.727 ************************************ 00:07:36.727 END TEST env_memory 00:07:36.727 ************************************ 00:07:36.727 07:13:09 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:36.727 07:13:09 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:36.727 07:13:09 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:36.727 07:13:09 env 
-- common/autotest_common.sh@10 -- # set +x 00:07:36.727 ************************************ 00:07:36.727 START TEST env_vtophys 00:07:36.727 ************************************ 00:07:36.727 07:13:09 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:36.990 EAL: lib.eal log level changed from notice to debug 00:07:36.990 EAL: Detected lcore 0 as core 0 on socket 0 00:07:36.990 EAL: Detected lcore 1 as core 1 on socket 0 00:07:36.990 EAL: Detected lcore 2 as core 2 on socket 0 00:07:36.990 EAL: Detected lcore 3 as core 3 on socket 0 00:07:36.990 EAL: Detected lcore 4 as core 4 on socket 0 00:07:36.990 EAL: Detected lcore 5 as core 5 on socket 0 00:07:36.990 EAL: Detected lcore 6 as core 6 on socket 0 00:07:36.990 EAL: Detected lcore 7 as core 8 on socket 0 00:07:36.990 EAL: Detected lcore 8 as core 9 on socket 0 00:07:36.990 EAL: Detected lcore 9 as core 10 on socket 0 00:07:36.990 EAL: Detected lcore 10 as core 11 on socket 0 00:07:36.990 EAL: Detected lcore 11 as core 12 on socket 0 00:07:36.990 EAL: Detected lcore 12 as core 13 on socket 0 00:07:36.990 EAL: Detected lcore 13 as core 14 on socket 0 00:07:36.990 EAL: Detected lcore 14 as core 16 on socket 0 00:07:36.990 EAL: Detected lcore 15 as core 17 on socket 0 00:07:36.990 EAL: Detected lcore 16 as core 18 on socket 0 00:07:36.990 EAL: Detected lcore 17 as core 19 on socket 0 00:07:36.990 EAL: Detected lcore 18 as core 20 on socket 0 00:07:36.990 EAL: Detected lcore 19 as core 21 on socket 0 00:07:36.990 EAL: Detected lcore 20 as core 22 on socket 0 00:07:36.990 EAL: Detected lcore 21 as core 24 on socket 0 00:07:36.990 EAL: Detected lcore 22 as core 25 on socket 0 00:07:36.990 EAL: Detected lcore 23 as core 26 on socket 0 00:07:36.990 EAL: Detected lcore 24 as core 27 on socket 0 00:07:36.990 EAL: Detected lcore 25 as core 28 on socket 0 00:07:36.990 EAL: Detected lcore 26 as core 29 on socket 0 00:07:36.990 EAL: Detected lcore 27 as core 30 on socket 0 00:07:36.990 EAL: Detected lcore 28 as core 0 on socket 1 00:07:36.990 EAL: Detected lcore 29 as core 1 on socket 1 00:07:36.990 EAL: Detected lcore 30 as core 2 on socket 1 00:07:36.990 EAL: Detected lcore 31 as core 3 on socket 1 00:07:36.990 EAL: Detected lcore 32 as core 4 on socket 1 00:07:36.990 EAL: Detected lcore 33 as core 5 on socket 1 00:07:36.990 EAL: Detected lcore 34 as core 6 on socket 1 00:07:36.990 EAL: Detected lcore 35 as core 8 on socket 1 00:07:36.990 EAL: Detected lcore 36 as core 9 on socket 1 00:07:36.990 EAL: Detected lcore 37 as core 10 on socket 1 00:07:36.990 EAL: Detected lcore 38 as core 11 on socket 1 00:07:36.990 EAL: Detected lcore 39 as core 12 on socket 1 00:07:36.990 EAL: Detected lcore 40 as core 13 on socket 1 00:07:36.990 EAL: Detected lcore 41 as core 14 on socket 1 00:07:36.990 EAL: Detected lcore 42 as core 16 on socket 1 00:07:36.990 EAL: Detected lcore 43 as core 17 on socket 1 00:07:36.990 EAL: Detected lcore 44 as core 18 on socket 1 00:07:36.990 EAL: Detected lcore 45 as core 19 on socket 1 00:07:36.990 EAL: Detected lcore 46 as core 20 on socket 1 00:07:36.990 EAL: Detected lcore 47 as core 21 on socket 1 00:07:36.990 EAL: Detected lcore 48 as core 22 on socket 1 00:07:36.990 EAL: Detected lcore 49 as core 24 on socket 1 00:07:36.990 EAL: Detected lcore 50 as core 25 on socket 1 00:07:36.990 EAL: Detected lcore 51 as core 26 on socket 1 00:07:36.990 EAL: Detected lcore 52 as core 27 on socket 1 00:07:36.990 EAL: Detected lcore 53 as core 28 on socket 1 
00:07:36.990 EAL: Detected lcore 54 as core 29 on socket 1 00:07:36.990 EAL: Detected lcore 55 as core 30 on socket 1 00:07:36.990 EAL: Detected lcore 56 as core 0 on socket 0 00:07:36.990 EAL: Detected lcore 57 as core 1 on socket 0 00:07:36.990 EAL: Detected lcore 58 as core 2 on socket 0 00:07:36.990 EAL: Detected lcore 59 as core 3 on socket 0 00:07:36.990 EAL: Detected lcore 60 as core 4 on socket 0 00:07:36.990 EAL: Detected lcore 61 as core 5 on socket 0 00:07:36.990 EAL: Detected lcore 62 as core 6 on socket 0 00:07:36.990 EAL: Detected lcore 63 as core 8 on socket 0 00:07:36.990 EAL: Detected lcore 64 as core 9 on socket 0 00:07:36.990 EAL: Detected lcore 65 as core 10 on socket 0 00:07:36.990 EAL: Detected lcore 66 as core 11 on socket 0 00:07:36.990 EAL: Detected lcore 67 as core 12 on socket 0 00:07:36.990 EAL: Detected lcore 68 as core 13 on socket 0 00:07:36.990 EAL: Detected lcore 69 as core 14 on socket 0 00:07:36.990 EAL: Detected lcore 70 as core 16 on socket 0 00:07:36.990 EAL: Detected lcore 71 as core 17 on socket 0 00:07:36.990 EAL: Detected lcore 72 as core 18 on socket 0 00:07:36.990 EAL: Detected lcore 73 as core 19 on socket 0 00:07:36.990 EAL: Detected lcore 74 as core 20 on socket 0 00:07:36.990 EAL: Detected lcore 75 as core 21 on socket 0 00:07:36.990 EAL: Detected lcore 76 as core 22 on socket 0 00:07:36.990 EAL: Detected lcore 77 as core 24 on socket 0 00:07:36.990 EAL: Detected lcore 78 as core 25 on socket 0 00:07:36.990 EAL: Detected lcore 79 as core 26 on socket 0 00:07:36.990 EAL: Detected lcore 80 as core 27 on socket 0 00:07:36.990 EAL: Detected lcore 81 as core 28 on socket 0 00:07:36.990 EAL: Detected lcore 82 as core 29 on socket 0 00:07:36.990 EAL: Detected lcore 83 as core 30 on socket 0 00:07:36.990 EAL: Detected lcore 84 as core 0 on socket 1 00:07:36.990 EAL: Detected lcore 85 as core 1 on socket 1 00:07:36.990 EAL: Detected lcore 86 as core 2 on socket 1 00:07:36.990 EAL: Detected lcore 87 as core 3 on socket 1 00:07:36.990 EAL: Detected lcore 88 as core 4 on socket 1 00:07:36.990 EAL: Detected lcore 89 as core 5 on socket 1 00:07:36.990 EAL: Detected lcore 90 as core 6 on socket 1 00:07:36.990 EAL: Detected lcore 91 as core 8 on socket 1 00:07:36.990 EAL: Detected lcore 92 as core 9 on socket 1 00:07:36.990 EAL: Detected lcore 93 as core 10 on socket 1 00:07:36.990 EAL: Detected lcore 94 as core 11 on socket 1 00:07:36.990 EAL: Detected lcore 95 as core 12 on socket 1 00:07:36.990 EAL: Detected lcore 96 as core 13 on socket 1 00:07:36.990 EAL: Detected lcore 97 as core 14 on socket 1 00:07:36.990 EAL: Detected lcore 98 as core 16 on socket 1 00:07:36.990 EAL: Detected lcore 99 as core 17 on socket 1 00:07:36.990 EAL: Detected lcore 100 as core 18 on socket 1 00:07:36.990 EAL: Detected lcore 101 as core 19 on socket 1 00:07:36.990 EAL: Detected lcore 102 as core 20 on socket 1 00:07:36.990 EAL: Detected lcore 103 as core 21 on socket 1 00:07:36.990 EAL: Detected lcore 104 as core 22 on socket 1 00:07:36.990 EAL: Detected lcore 105 as core 24 on socket 1 00:07:36.990 EAL: Detected lcore 106 as core 25 on socket 1 00:07:36.990 EAL: Detected lcore 107 as core 26 on socket 1 00:07:36.990 EAL: Detected lcore 108 as core 27 on socket 1 00:07:36.990 EAL: Detected lcore 109 as core 28 on socket 1 00:07:36.990 EAL: Detected lcore 110 as core 29 on socket 1 00:07:36.990 EAL: Detected lcore 111 as core 30 on socket 1 00:07:36.990 EAL: Maximum logical cores by configuration: 128 00:07:36.990 EAL: Detected CPU lcores: 112 00:07:36.990 EAL: Detected NUMA 
nodes: 2 00:07:36.990 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:07:36.990 EAL: Detected shared linkage of DPDK 00:07:36.990 EAL: No shared files mode enabled, IPC will be disabled 00:07:36.990 EAL: No shared files mode enabled, IPC is disabled 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.0 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.1 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.2 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.3 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.4 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.5 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.6 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:01.7 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.0 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.1 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.2 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.3 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.4 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.5 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.6 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1a:02.7 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.0 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.1 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.2 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.3 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.4 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.5 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.6 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:01.7 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:02.0 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:02.1 wants IOVA as 'PA' 00:07:36.990 EAL: PCI driver qat for device 0000:1c:02.2 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1c:02.3 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1c:02.4 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1c:02.5 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1c:02.6 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1c:02.7 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.0 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.1 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.2 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.3 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.4 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.5 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.6 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:01.7 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:02.0 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:02.1 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:02.2 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver 
qat for device 0000:1e:02.3 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:02.4 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:02.5 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:02.6 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:1e:02.7 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:07:36.991 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:07:36.991 EAL: Bus pci wants IOVA as 'PA' 00:07:36.991 EAL: Bus auxiliary wants IOVA as 'DC' 00:07:36.991 EAL: Bus vdev wants IOVA as 'DC' 00:07:36.991 EAL: Selected IOVA mode 'PA' 00:07:36.991 EAL: Probing VFIO support... 00:07:36.991 EAL: IOMMU type 1 (Type 1) is supported 00:07:36.991 EAL: IOMMU type 7 (sPAPR) is not supported 00:07:36.991 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:07:36.991 EAL: VFIO support initialized 00:07:36.991 EAL: Ask a virtual area of 0x2e000 bytes 00:07:36.991 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:07:36.991 EAL: Setting up physically contiguous memory... 
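EAL reports above that it selected IOVA mode 'PA', that IOMMU Type 1 is supported, and that VFIO support is initialized. A quick host-side sanity check corresponding to that probe (an illustrative sketch; EAL itself performs the type checks via VFIO_CHECK_EXTENSION ioctls on the VFIO container, not with these commands):

  ls /sys/kernel/iommu_groups | head            # non-empty when an IOMMU is active
  [ -c /dev/vfio/vfio ] && echo "VFIO container device present"
  lsmod | grep -w vfio_pci                      # vfio-pci bound devices need this module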
00:07:36.991 EAL: Setting maximum number of open files to 524288 00:07:36.991 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:07:36.991 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:07:36.991 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:07:36.991 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:07:36.991 EAL: Ask a virtual area of 0x61000 bytes 00:07:36.991 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:07:36.991 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:36.991 EAL: Ask a virtual area of 0x400000000 bytes 00:07:36.991 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:07:36.991 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:07:36.991 EAL: Hugepages will be freed exactly as allocated. 00:07:36.991 EAL: No shared files mode enabled, IPC is disabled 00:07:36.991 EAL: No shared files mode enabled, IPC is disabled 00:07:36.991 EAL: TSC frequency is ~2500000 KHz 00:07:36.991 EAL: Main lcore 0 is ready (tid=7f90f4955b00;cpuset=[0]) 00:07:36.991 EAL: Trying to obtain current memory policy. 00:07:36.991 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.991 EAL: Restoring previous memory policy: 0 00:07:36.992 EAL: request: mp_malloc_sync 00:07:36.992 EAL: No shared files mode enabled, IPC is disabled 00:07:36.992 EAL: Heap on socket 0 was expanded by 2MB 00:07:36.992 EAL: PCI device 0000:1a:01.0 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001000000 00:07:36.992 EAL: PCI memory mapped at 0x202001001000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:01.1 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001002000 00:07:36.992 EAL: PCI memory mapped at 0x202001003000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:01.2 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001004000 00:07:36.992 EAL: PCI memory mapped at 0x202001005000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:01.3 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001006000 00:07:36.992 EAL: PCI memory mapped at 0x202001007000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:01.4 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001008000 00:07:36.992 EAL: PCI memory mapped at 0x202001009000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:01.5 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200100a000 00:07:36.992 EAL: PCI memory mapped at 0x20200100b000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:01.6 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200100c000 00:07:36.992 EAL: PCI memory mapped at 0x20200100d000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:01.7 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200100e000 00:07:36.992 EAL: PCI memory mapped at 0x20200100f000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.0 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001010000 00:07:36.992 EAL: PCI memory mapped at 0x202001011000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.1 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 
EAL: PCI memory mapped at 0x202001012000 00:07:36.992 EAL: PCI memory mapped at 0x202001013000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.2 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001014000 00:07:36.992 EAL: PCI memory mapped at 0x202001015000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.3 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001016000 00:07:36.992 EAL: PCI memory mapped at 0x202001017000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.4 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001018000 00:07:36.992 EAL: PCI memory mapped at 0x202001019000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.5 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200101a000 00:07:36.992 EAL: PCI memory mapped at 0x20200101b000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.6 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200101c000 00:07:36.992 EAL: PCI memory mapped at 0x20200101d000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:07:36.992 EAL: PCI device 0000:1a:02.7 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200101e000 00:07:36.992 EAL: PCI memory mapped at 0x20200101f000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.0 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001020000 00:07:36.992 EAL: PCI memory mapped at 0x202001021000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.1 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001022000 00:07:36.992 EAL: PCI memory mapped at 0x202001023000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.2 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001024000 00:07:36.992 EAL: PCI memory mapped at 0x202001025000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.3 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001026000 00:07:36.992 EAL: PCI memory mapped at 0x202001027000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.4 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001028000 00:07:36.992 EAL: PCI memory mapped at 0x202001029000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.5 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 
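Every probe line above names a QAT virtual function by its PCI address in domain:bus:device.function form. A small sketch, assuming librte_pci is available, of turning one of those strings into a struct rte_pci_addr with the public helper rte_pci_addr_parse(); the address literal is simply one copied from the log:

    #include <stdio.h>
    #include <rte_pci.h>

    int main(void)
    {
        struct rte_pci_addr addr;

        /* Parse a BDF string like the ones printed for each QAT VF. */
        if (rte_pci_addr_parse("0000:1a:01.0", &addr) != 0) {
            fprintf(stderr, "bad PCI address\n");
            return 1;
        }

        printf("domain=%04x bus=%02x devid=%02x function=%x\n",
               addr.domain, addr.bus, addr.devid, addr.function);
        return 0;
    }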
00:07:36.992 EAL: PCI memory mapped at 0x20200102a000 00:07:36.992 EAL: PCI memory mapped at 0x20200102b000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.6 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200102c000 00:07:36.992 EAL: PCI memory mapped at 0x20200102d000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:01.7 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x20200102e000 00:07:36.992 EAL: PCI memory mapped at 0x20200102f000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:02.0 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001030000 00:07:36.992 EAL: PCI memory mapped at 0x202001031000 00:07:36.992 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:07:36.992 EAL: PCI device 0000:1c:02.1 on NUMA socket 0 00:07:36.992 EAL: probe driver: 8086:37c9 qat 00:07:36.992 EAL: PCI memory mapped at 0x202001032000 00:07:36.992 EAL: PCI memory mapped at 0x202001033000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:07:36.993 EAL: PCI device 0000:1c:02.2 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001034000 00:07:36.993 EAL: PCI memory mapped at 0x202001035000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:07:36.993 EAL: PCI device 0000:1c:02.3 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001036000 00:07:36.993 EAL: PCI memory mapped at 0x202001037000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:07:36.993 EAL: PCI device 0000:1c:02.4 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001038000 00:07:36.993 EAL: PCI memory mapped at 0x202001039000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:07:36.993 EAL: PCI device 0000:1c:02.5 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200103a000 00:07:36.993 EAL: PCI memory mapped at 0x20200103b000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:07:36.993 EAL: PCI device 0000:1c:02.6 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200103c000 00:07:36.993 EAL: PCI memory mapped at 0x20200103d000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:07:36.993 EAL: PCI device 0000:1c:02.7 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200103e000 00:07:36.993 EAL: PCI memory mapped at 0x20200103f000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.0 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001040000 00:07:36.993 EAL: PCI memory mapped at 0x202001041000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.1 on NUMA socket 0 00:07:36.993 EAL: probe driver: 
8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001042000 00:07:36.993 EAL: PCI memory mapped at 0x202001043000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.2 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001044000 00:07:36.993 EAL: PCI memory mapped at 0x202001045000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.3 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001046000 00:07:36.993 EAL: PCI memory mapped at 0x202001047000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.4 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001048000 00:07:36.993 EAL: PCI memory mapped at 0x202001049000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.5 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200104a000 00:07:36.993 EAL: PCI memory mapped at 0x20200104b000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.6 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200104c000 00:07:36.993 EAL: PCI memory mapped at 0x20200104d000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:01.7 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200104e000 00:07:36.993 EAL: PCI memory mapped at 0x20200104f000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.0 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001050000 00:07:36.993 EAL: PCI memory mapped at 0x202001051000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.1 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001052000 00:07:36.993 EAL: PCI memory mapped at 0x202001053000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.2 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001054000 00:07:36.993 EAL: PCI memory mapped at 0x202001055000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.3 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001056000 00:07:36.993 EAL: PCI memory mapped at 0x202001057000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.4 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001058000 00:07:36.993 EAL: PCI memory mapped at 0x202001059000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.5 on NUMA socket 0 00:07:36.993 EAL: 
probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200105a000 00:07:36.993 EAL: PCI memory mapped at 0x20200105b000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.6 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200105c000 00:07:36.993 EAL: PCI memory mapped at 0x20200105d000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:07:36.993 EAL: PCI device 0000:1e:02.7 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x20200105e000 00:07:36.993 EAL: PCI memory mapped at 0x20200105f000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:07:36.993 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001060000 00:07:36.993 EAL: PCI memory mapped at 0x202001061000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:36.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.993 EAL: PCI memory unmapped at 0x202001060000 00:07:36.993 EAL: PCI memory unmapped at 0x202001061000 00:07:36.993 EAL: Requested device 0000:3d:01.0 cannot be used 00:07:36.993 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001062000 00:07:36.993 EAL: PCI memory mapped at 0x202001063000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:36.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.993 EAL: PCI memory unmapped at 0x202001062000 00:07:36.993 EAL: PCI memory unmapped at 0x202001063000 00:07:36.993 EAL: Requested device 0000:3d:01.1 cannot be used 00:07:36.993 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001064000 00:07:36.993 EAL: PCI memory mapped at 0x202001065000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:36.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.993 EAL: PCI memory unmapped at 0x202001064000 00:07:36.993 EAL: PCI memory unmapped at 0x202001065000 00:07:36.993 EAL: Requested device 0000:3d:01.2 cannot be used 00:07:36.993 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001066000 00:07:36.993 EAL: PCI memory mapped at 0x202001067000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:36.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.993 EAL: PCI memory unmapped at 0x202001066000 00:07:36.993 EAL: PCI memory unmapped at 0x202001067000 00:07:36.993 EAL: Requested device 0000:3d:01.3 cannot be used 00:07:36.993 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:07:36.993 EAL: probe driver: 8086:37c9 qat 00:07:36.993 EAL: PCI memory mapped at 0x202001068000 00:07:36.993 EAL: PCI memory mapped at 0x202001069000 00:07:36.993 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:36.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.993 EAL: PCI memory unmapped at 0x202001068000 00:07:36.994 EAL: PCI memory unmapped at 0x202001069000 00:07:36.994 EAL: Requested device 0000:3d:01.4 
cannot be used 00:07:36.994 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x20200106a000 00:07:36.994 EAL: PCI memory mapped at 0x20200106b000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x20200106a000 00:07:36.994 EAL: PCI memory unmapped at 0x20200106b000 00:07:36.994 EAL: Requested device 0000:3d:01.5 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x20200106c000 00:07:36.994 EAL: PCI memory mapped at 0x20200106d000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x20200106c000 00:07:36.994 EAL: PCI memory unmapped at 0x20200106d000 00:07:36.994 EAL: Requested device 0000:3d:01.6 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x20200106e000 00:07:36.994 EAL: PCI memory mapped at 0x20200106f000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x20200106e000 00:07:36.994 EAL: PCI memory unmapped at 0x20200106f000 00:07:36.994 EAL: Requested device 0000:3d:01.7 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001070000 00:07:36.994 EAL: PCI memory mapped at 0x202001071000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001070000 00:07:36.994 EAL: PCI memory unmapped at 0x202001071000 00:07:36.994 EAL: Requested device 0000:3d:02.0 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001072000 00:07:36.994 EAL: PCI memory mapped at 0x202001073000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001072000 00:07:36.994 EAL: PCI memory unmapped at 0x202001073000 00:07:36.994 EAL: Requested device 0000:3d:02.1 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001074000 00:07:36.994 EAL: PCI memory mapped at 0x202001075000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001074000 00:07:36.994 EAL: PCI memory unmapped at 0x202001075000 00:07:36.994 EAL: Requested device 0000:3d:02.2 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001076000 00:07:36.994 EAL: PCI memory mapped at 0x202001077000 00:07:36.994 EAL: Probe 
PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001076000 00:07:36.994 EAL: PCI memory unmapped at 0x202001077000 00:07:36.994 EAL: Requested device 0000:3d:02.3 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001078000 00:07:36.994 EAL: PCI memory mapped at 0x202001079000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001078000 00:07:36.994 EAL: PCI memory unmapped at 0x202001079000 00:07:36.994 EAL: Requested device 0000:3d:02.4 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x20200107a000 00:07:36.994 EAL: PCI memory mapped at 0x20200107b000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x20200107a000 00:07:36.994 EAL: PCI memory unmapped at 0x20200107b000 00:07:36.994 EAL: Requested device 0000:3d:02.5 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x20200107c000 00:07:36.994 EAL: PCI memory mapped at 0x20200107d000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x20200107c000 00:07:36.994 EAL: PCI memory unmapped at 0x20200107d000 00:07:36.994 EAL: Requested device 0000:3d:02.6 cannot be used 00:07:36.994 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x20200107e000 00:07:36.994 EAL: PCI memory mapped at 0x20200107f000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x20200107e000 00:07:36.994 EAL: PCI memory unmapped at 0x20200107f000 00:07:36.994 EAL: Requested device 0000:3d:02.7 cannot be used 00:07:36.994 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001080000 00:07:36.994 EAL: PCI memory mapped at 0x202001081000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001080000 00:07:36.994 EAL: PCI memory unmapped at 0x202001081000 00:07:36.994 EAL: Requested device 0000:3f:01.0 cannot be used 00:07:36.994 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001082000 00:07:36.994 EAL: PCI memory mapped at 0x202001083000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001082000 00:07:36.994 EAL: PCI memory unmapped at 0x202001083000 
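The repeated "qat_pci_device_allocate(): Reached maximum number of QAT devices" messages indicate the QAT PMD enforces a compile-time cap on how many PCI devices it registers, so the surplus VFs are mapped, rejected, and unmapped again. A simplified, illustrative sketch of that kind of guard; the constant name and the count of 48 are assumptions for illustration, not the driver's actual definitions:

    #include <stdio.h>

    /* Illustrative stand-in for the PMD's real compile-time limit. */
    #define EXAMPLE_MAX_QAT_DEVICES 48

    static int g_num_qat_devices;

    /* Sketch of the allocation guard that produces the log line above. */
    static int example_qat_device_allocate(const char *bdf)
    {
        if (g_num_qat_devices >= EXAMPLE_MAX_QAT_DEVICES) {
            printf("qat_pci_device_allocate(): Reached maximum number of QAT devices\n");
            return -1;   /* caller unmaps the BARs and skips the device */
        }
        g_num_qat_devices++;
        printf("registered QAT device %s\n", bdf);
        return 0;
    }

    int main(void)
    {
        /* The device past the cap is rejected, as seen in the log. */
        for (int i = 0; i < EXAMPLE_MAX_QAT_DEVICES + 1; i++)
            example_qat_device_allocate("0000:3d:01.0");
        return 0;
    }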
00:07:36.994 EAL: Requested device 0000:3f:01.1 cannot be used 00:07:36.994 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001084000 00:07:36.994 EAL: PCI memory mapped at 0x202001085000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001084000 00:07:36.994 EAL: PCI memory unmapped at 0x202001085000 00:07:36.994 EAL: Requested device 0000:3f:01.2 cannot be used 00:07:36.994 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001086000 00:07:36.994 EAL: PCI memory mapped at 0x202001087000 00:07:36.994 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:07:36.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.994 EAL: PCI memory unmapped at 0x202001086000 00:07:36.994 EAL: PCI memory unmapped at 0x202001087000 00:07:36.994 EAL: Requested device 0000:3f:01.3 cannot be used 00:07:36.994 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:07:36.994 EAL: probe driver: 8086:37c9 qat 00:07:36.994 EAL: PCI memory mapped at 0x202001088000 00:07:36.994 EAL: PCI memory mapped at 0x202001089000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x202001088000 00:07:36.995 EAL: PCI memory unmapped at 0x202001089000 00:07:36.995 EAL: Requested device 0000:3f:01.4 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x20200108a000 00:07:36.995 EAL: PCI memory mapped at 0x20200108b000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x20200108a000 00:07:36.995 EAL: PCI memory unmapped at 0x20200108b000 00:07:36.995 EAL: Requested device 0000:3f:01.5 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x20200108c000 00:07:36.995 EAL: PCI memory mapped at 0x20200108d000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x20200108c000 00:07:36.995 EAL: PCI memory unmapped at 0x20200108d000 00:07:36.995 EAL: Requested device 0000:3f:01.6 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x20200108e000 00:07:36.995 EAL: PCI memory mapped at 0x20200108f000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x20200108e000 00:07:36.995 EAL: PCI memory unmapped at 0x20200108f000 00:07:36.995 EAL: Requested device 0000:3f:01.7 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x202001090000 00:07:36.995 EAL: PCI memory 
mapped at 0x202001091000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x202001090000 00:07:36.995 EAL: PCI memory unmapped at 0x202001091000 00:07:36.995 EAL: Requested device 0000:3f:02.0 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x202001092000 00:07:36.995 EAL: PCI memory mapped at 0x202001093000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x202001092000 00:07:36.995 EAL: PCI memory unmapped at 0x202001093000 00:07:36.995 EAL: Requested device 0000:3f:02.1 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x202001094000 00:07:36.995 EAL: PCI memory mapped at 0x202001095000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x202001094000 00:07:36.995 EAL: PCI memory unmapped at 0x202001095000 00:07:36.995 EAL: Requested device 0000:3f:02.2 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x202001096000 00:07:36.995 EAL: PCI memory mapped at 0x202001097000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x202001096000 00:07:36.995 EAL: PCI memory unmapped at 0x202001097000 00:07:36.995 EAL: Requested device 0000:3f:02.3 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x202001098000 00:07:36.995 EAL: PCI memory mapped at 0x202001099000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x202001098000 00:07:36.995 EAL: PCI memory unmapped at 0x202001099000 00:07:36.995 EAL: Requested device 0000:3f:02.4 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x20200109a000 00:07:36.995 EAL: PCI memory mapped at 0x20200109b000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x20200109a000 00:07:36.995 EAL: PCI memory unmapped at 0x20200109b000 00:07:36.995 EAL: Requested device 0000:3f:02.5 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x20200109c000 00:07:36.995 EAL: PCI memory mapped at 0x20200109d000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x20200109c000 
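One way to avoid probing and then rejecting every spare VF is to hand EAL an explicit allow-list so only the devices a run actually needs are bound. A minimal sketch using the -a/--allow EAL option of recent DPDK releases (older releases used -w); the two addresses are just examples taken from the log:

    #include <stdio.h>
    #include <rte_eal.h>

    int main(void)
    {
        /* Restrict bus probing to two QAT VFs instead of the whole set. */
        char *eal_argv[] = {
            "allow_list_example",
            "-a", "0000:1a:01.0",
            "-a", "0000:1a:01.1",
        };
        int eal_argc = sizeof(eal_argv) / sizeof(eal_argv[0]);

        if (rte_eal_init(eal_argc, eal_argv) < 0) {
            fprintf(stderr, "EAL init failed\n");
            return 1;
        }
        return rte_eal_cleanup();
    }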
00:07:36.995 EAL: PCI memory unmapped at 0x20200109d000 00:07:36.995 EAL: Requested device 0000:3f:02.6 cannot be used 00:07:36.995 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:07:36.995 EAL: probe driver: 8086:37c9 qat 00:07:36.995 EAL: PCI memory mapped at 0x20200109e000 00:07:36.995 EAL: PCI memory mapped at 0x20200109f000 00:07:36.995 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:36.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:36.995 EAL: PCI memory unmapped at 0x20200109e000 00:07:36.995 EAL: PCI memory unmapped at 0x20200109f000 00:07:36.995 EAL: Requested device 0000:3f:02.7 cannot be used 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: No PCI address specified using 'addr=' in: bus=pci 00:07:36.995 EAL: Mem event callback 'spdk:(nil)' registered 00:07:36.995 00:07:36.995 00:07:36.995 CUnit - A unit testing framework for C - Version 2.1-3 00:07:36.995 http://cunit.sourceforge.net/ 00:07:36.995 00:07:36.995 00:07:36.995 Suite: components_suite 00:07:36.995 Test: vtophys_malloc_test ...passed 00:07:36.995 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:07:36.995 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.995 EAL: Restoring previous memory policy: 4 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was expanded by 4MB 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was shrunk by 4MB 00:07:36.995 EAL: Trying to obtain current memory policy. 00:07:36.995 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.995 EAL: Restoring previous memory policy: 4 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was expanded by 6MB 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was shrunk by 6MB 00:07:36.995 EAL: Trying to obtain current memory policy. 00:07:36.995 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.995 EAL: Restoring previous memory policy: 4 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was expanded by 10MB 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was shrunk by 10MB 00:07:36.995 EAL: Trying to obtain current memory policy. 
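The "Mem event callback 'spdk:(nil)' registered" line above, and every later "Calling mem event callback 'spdk:(nil)'", correspond to DPDK invoking a hook that SPDK registers at startup so it can track hugepage memory as it is added to or removed from the heap. A minimal sketch, assuming a plain DPDK program rather than the SPDK test binary, of registering such a callback directly with the DPDK API:

    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_memory.h>

    /* Invoked by EAL whenever heap memory is allocated or freed, the same
     * hook behind the expand/shrink lines in the log. */
    static void
    mem_event_cb(enum rte_mem_event event, const void *addr, size_t len, void *arg)
    {
        (void)arg;
        printf("%s: addr=%p len=%zu\n",
               event == RTE_MEM_EVENT_ALLOC ? "alloc" : "free", addr, len);
    }

    int main(int argc, char **argv)
    {
        if (rte_eal_init(argc, argv) < 0)
            return 1;

        if (rte_mem_event_callback_register("example", mem_event_cb, NULL) != 0)
            fprintf(stderr, "callback registration failed\n");

        /* ... allocate and free DPDK memory here to see events fire ... */

        return rte_eal_cleanup();
    }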
00:07:36.995 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.995 EAL: Restoring previous memory policy: 4 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was expanded by 18MB 00:07:36.995 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.995 EAL: request: mp_malloc_sync 00:07:36.995 EAL: No shared files mode enabled, IPC is disabled 00:07:36.995 EAL: Heap on socket 0 was shrunk by 18MB 00:07:36.997 EAL: Trying to obtain current memory policy. 00:07:36.997 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.997 EAL: Restoring previous memory policy: 4 00:07:36.997 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.997 EAL: request: mp_malloc_sync 00:07:36.997 EAL: No shared files mode enabled, IPC is disabled 00:07:36.997 EAL: Heap on socket 0 was expanded by 34MB 00:07:36.997 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.997 EAL: request: mp_malloc_sync 00:07:36.997 EAL: No shared files mode enabled, IPC is disabled 00:07:36.997 EAL: Heap on socket 0 was shrunk by 34MB 00:07:36.997 EAL: Trying to obtain current memory policy. 00:07:36.997 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.997 EAL: Restoring previous memory policy: 4 00:07:36.997 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.997 EAL: request: mp_malloc_sync 00:07:36.997 EAL: No shared files mode enabled, IPC is disabled 00:07:36.997 EAL: Heap on socket 0 was expanded by 66MB 00:07:36.997 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.997 EAL: request: mp_malloc_sync 00:07:36.997 EAL: No shared files mode enabled, IPC is disabled 00:07:36.997 EAL: Heap on socket 0 was shrunk by 66MB 00:07:36.997 EAL: Trying to obtain current memory policy. 00:07:36.997 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:36.997 EAL: Restoring previous memory policy: 4 00:07:36.997 EAL: Calling mem event callback 'spdk:(nil)' 00:07:36.997 EAL: request: mp_malloc_sync 00:07:36.997 EAL: No shared files mode enabled, IPC is disabled 00:07:36.997 EAL: Heap on socket 0 was expanded by 130MB 00:07:36.997 EAL: Calling mem event callback 'spdk:(nil)' 00:07:37.257 EAL: request: mp_malloc_sync 00:07:37.257 EAL: No shared files mode enabled, IPC is disabled 00:07:37.257 EAL: Heap on socket 0 was shrunk by 130MB 00:07:37.257 EAL: Trying to obtain current memory policy. 00:07:37.257 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:37.257 EAL: Restoring previous memory policy: 4 00:07:37.257 EAL: Calling mem event callback 'spdk:(nil)' 00:07:37.257 EAL: request: mp_malloc_sync 00:07:37.257 EAL: No shared files mode enabled, IPC is disabled 00:07:37.257 EAL: Heap on socket 0 was expanded by 258MB 00:07:37.257 EAL: Calling mem event callback 'spdk:(nil)' 00:07:37.257 EAL: request: mp_malloc_sync 00:07:37.257 EAL: No shared files mode enabled, IPC is disabled 00:07:37.257 EAL: Heap on socket 0 was shrunk by 258MB 00:07:37.257 EAL: Trying to obtain current memory policy. 
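The growing expand/shrink pairs (4MB, 6MB, 10MB, 18MB, ...) come from the vtophys tests allocating and freeing progressively larger hugepage-backed buffers and translating them to physical addresses. A minimal sketch of the user-facing SPDK pattern being exercised, assuming a standalone example program built against spdk/env.h rather than the test binary itself:

    #include <stdio.h>
    #include <inttypes.h>
    #include "spdk/env.h"

    int main(void)
    {
        struct spdk_env_opts opts;
        spdk_env_opts_init(&opts);
        opts.name = "vtophys_example";
        if (spdk_env_init(&opts) < 0)
            return 1;

        /* Hugepage-backed allocation; growing sizes are what drive the
         * "Heap on socket 0 was expanded by ..." callbacks in the log. */
        void *buf = spdk_dma_zmalloc(4 * 1024 * 1024, 0x1000, NULL);
        if (buf == NULL)
            return 1;

        uint64_t size = 4 * 1024 * 1024;
        uint64_t paddr = spdk_vtophys(buf, &size);
        printf("vaddr=%p paddr=0x%" PRIx64 "\n", buf, paddr);

        spdk_dma_free(buf);   /* triggers the matching "shrunk by ..." event */
        spdk_env_fini();
        return 0;
    }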
00:07:37.257 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:37.257 EAL: Restoring previous memory policy: 4 00:07:37.257 EAL: Calling mem event callback 'spdk:(nil)' 00:07:37.257 EAL: request: mp_malloc_sync 00:07:37.257 EAL: No shared files mode enabled, IPC is disabled 00:07:37.257 EAL: Heap on socket 0 was expanded by 514MB 00:07:37.516 EAL: Calling mem event callback 'spdk:(nil)' 00:07:37.516 EAL: request: mp_malloc_sync 00:07:37.516 EAL: No shared files mode enabled, IPC is disabled 00:07:37.516 EAL: Heap on socket 0 was shrunk by 514MB 00:07:37.516 EAL: Trying to obtain current memory policy. 00:07:37.516 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:37.776 EAL: Restoring previous memory policy: 4 00:07:37.776 EAL: Calling mem event callback 'spdk:(nil)' 00:07:37.776 EAL: request: mp_malloc_sync 00:07:37.776 EAL: No shared files mode enabled, IPC is disabled 00:07:37.776 EAL: Heap on socket 0 was expanded by 1026MB 00:07:37.776 EAL: Calling mem event callback 'spdk:(nil)' 00:07:38.036 EAL: request: mp_malloc_sync 00:07:38.036 EAL: No shared files mode enabled, IPC is disabled 00:07:38.036 EAL: Heap on socket 0 was shrunk by 1026MB 00:07:38.036 passed 00:07:38.036 00:07:38.036 Run Summary: Type Total Ran Passed Failed Inactive 00:07:38.036 suites 1 1 n/a 0 0 00:07:38.036 tests 2 2 2 0 0 00:07:38.036 asserts 6583 6583 6583 0 n/a 00:07:38.036 00:07:38.036 Elapsed time = 1.016 seconds 00:07:38.036 EAL: No shared files mode enabled, IPC is disabled 00:07:38.036 EAL: No shared files mode enabled, IPC is disabled 00:07:38.036 EAL: No shared files mode enabled, IPC is disabled 00:07:38.036 00:07:38.036 real 0m1.213s 00:07:38.036 user 0m0.675s 00:07:38.036 sys 0m0.514s 00:07:38.036 07:13:10 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:38.036 07:13:10 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:07:38.036 ************************************ 00:07:38.036 END TEST env_vtophys 00:07:38.036 ************************************ 00:07:38.036 07:13:10 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:07:38.036 07:13:10 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:38.036 07:13:10 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:38.036 07:13:10 env -- common/autotest_common.sh@10 -- # set +x 00:07:38.036 ************************************ 00:07:38.036 START TEST env_pci 00:07:38.036 ************************************ 00:07:38.036 07:13:10 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:07:38.036 00:07:38.036 00:07:38.036 CUnit - A unit testing framework for C - Version 2.1-3 00:07:38.036 http://cunit.sourceforge.net/ 00:07:38.036 00:07:38.036 00:07:38.036 Suite: pci 00:07:38.036 Test: pci_hook ...[2024-07-25 07:13:10.567327] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1533198 has claimed it 00:07:38.297 EAL: Cannot find device (10000:00:01.0) 00:07:38.297 EAL: Failed to attach device on primary process 00:07:38.297 passed 00:07:38.297 00:07:38.297 Run Summary: Type Total Ran Passed Failed Inactive 00:07:38.297 suites 1 1 n/a 0 0 00:07:38.297 tests 1 1 1 0 0 00:07:38.297 asserts 25 25 25 0 n/a 00:07:38.297 00:07:38.297 Elapsed time = 0.045 seconds 00:07:38.297 00:07:38.297 real 0m0.073s 00:07:38.297 user 0m0.020s 
00:07:38.297 sys 0m0.053s 00:07:38.297 07:13:10 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:38.297 07:13:10 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:07:38.297 ************************************ 00:07:38.297 END TEST env_pci 00:07:38.297 ************************************ 00:07:38.297 07:13:10 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:07:38.297 07:13:10 env -- env/env.sh@15 -- # uname 00:07:38.297 07:13:10 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:07:38.297 07:13:10 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:07:38.297 07:13:10 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:38.297 07:13:10 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:07:38.297 07:13:10 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:38.297 07:13:10 env -- common/autotest_common.sh@10 -- # set +x 00:07:38.297 ************************************ 00:07:38.297 START TEST env_dpdk_post_init 00:07:38.297 ************************************ 00:07:38.297 07:13:10 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:38.298 EAL: Detected CPU lcores: 112 00:07:38.298 EAL: Detected NUMA nodes: 2 00:07:38.298 EAL: Detected shared linkage of DPDK 00:07:38.298 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:38.298 EAL: Selected IOVA mode 'PA' 00:07:38.298 EAL: VFIO support initialized 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:07:38.298 CRYPTODEV: Creating 
cryptodev 0000:1a:01.4_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:07:38.298 
CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 
0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.298 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:07:38.298 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:07:38.298 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue 
pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating 
cryptodev 0000:1e:01.4_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:07:38.299 
CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.299 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:07:38.299 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.299 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:07:38.300 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:07:38.300 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.300 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:07:38.300 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:07:38.300 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:07:38.300 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:38.300 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:07:38.300 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.0 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.1 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.2 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.3 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.4 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.5 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.6 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:01.7 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.0 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:38.300 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.1 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.2 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.3 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.4 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.5 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.6 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3d:02.7 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.0 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.1 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.2 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.3 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.4 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.5 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.6 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:01.7 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.0 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:02.1 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.1 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.2 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.3 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.4 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.5 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.6 cannot be used 00:07:38.300 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:38.300 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.300 EAL: Requested device 0000:3f:02.7 cannot be used 00:07:38.300 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:38.560 EAL: Using IOMMU type 1 (Type 1) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:07:38.560 EAL: Ignore mapping IO port bar(1) 00:07:38.560 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.0 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.1 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.2 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached 
maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.3 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.4 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.5 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.6 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:01.7 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:02.0 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:02.1 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:02.2 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.560 EAL: Requested device 0000:3d:02.3 cannot be used 00:07:38.560 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:38.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3d:02.4 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3d:02.5 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3d:02.6 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3d:02.7 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.0 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.1 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.2 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 
00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.3 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.4 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.5 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.6 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:01.7 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.0 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.1 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.2 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.3 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.4 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.5 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.6 cannot be used 00:07:38.561 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:38.561 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:38.561 EAL: Requested device 0000:3f:02.7 cannot be used 00:07:38.561 EAL: Ignore mapping IO port bar(1) 00:07:38.561 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:07:38.561 EAL: Ignore mapping IO port bar(1) 00:07:38.561 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:07:38.561 EAL: Ignore mapping IO port bar(1) 00:07:38.561 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:07:38.561 EAL: Ignore mapping IO port bar(1) 00:07:38.561 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:07:38.820 EAL: Ignore mapping IO port bar(1) 00:07:38.820 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:07:38.820 EAL: 
Ignore mapping IO port bar(1) 00:07:38.820 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:07:38.820 EAL: Ignore mapping IO port bar(1) 00:07:38.820 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:07:38.820 EAL: Ignore mapping IO port bar(1) 00:07:38.820 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:07:39.390 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:07:43.581 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:07:43.581 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001120000 00:07:43.843 Starting DPDK initialization... 00:07:43.843 Starting SPDK post initialization... 00:07:43.843 SPDK NVMe probe 00:07:43.843 Attaching to 0000:d8:00.0 00:07:43.843 Attached to 0000:d8:00.0 00:07:43.843 Cleaning up... 00:07:43.843 00:07:43.843 real 0m5.452s 00:07:43.843 user 0m3.978s 00:07:43.843 sys 0m0.523s 00:07:43.843 07:13:16 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:43.843 07:13:16 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:07:43.843 ************************************ 00:07:43.843 END TEST env_dpdk_post_init 00:07:43.843 ************************************ 00:07:43.843 07:13:16 env -- env/env.sh@26 -- # uname 00:07:43.843 07:13:16 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:07:43.843 07:13:16 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:43.843 07:13:16 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:43.843 07:13:16 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:43.843 07:13:16 env -- common/autotest_common.sh@10 -- # set +x 00:07:43.843 ************************************ 00:07:43.843 START TEST env_mem_callbacks 00:07:43.843 ************************************ 00:07:43.843 07:13:16 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:43.843 EAL: Detected CPU lcores: 112 00:07:43.843 EAL: Detected NUMA nodes: 2 00:07:43.843 EAL: Detected shared linkage of DPDK 00:07:43.843 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:43.843 EAL: Selected IOVA mode 'PA' 00:07:43.843 EAL: VFIO support initialized 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: 
Creating cryptodev 0000:1a:01.2_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 
00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.843 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:07:43.843 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.843 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters 
- name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max 
queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:07:43.844 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.844 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:07:43.844 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 
00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:43.845 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:07:43.845 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.0 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.1 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.2 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.3 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.4 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.5 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:43.845 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.6 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:01.7 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.0 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.1 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.2 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.3 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.4 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.5 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.6 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3d:02.7 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.0 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.1 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.2 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.3 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.4 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.5 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.6 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.6 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:01.7 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.0 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.1 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.2 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.3 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.4 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.5 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.6 cannot be used 00:07:43.845 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:43.845 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:43.845 EAL: Requested device 0000:3f:02.7 cannot be used 00:07:43.845 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:43.845 00:07:43.845 00:07:43.845 CUnit - A unit testing framework for C - Version 2.1-3 00:07:43.846 http://cunit.sourceforge.net/ 00:07:43.846 00:07:43.846 00:07:43.846 Suite: memory 00:07:43.846 Test: test ... 
00:07:43.846 register 0x200000200000 2097152
00:07:43.846 malloc 3145728
00:07:43.846 register 0x200000400000 4194304
00:07:43.846 buf 0x200000500000 len 3145728 PASSED
00:07:43.846 malloc 64
00:07:43.846 buf 0x2000004fff40 len 64 PASSED
00:07:43.846 malloc 4194304
00:07:43.846 register 0x200000800000 6291456
00:07:43.846 buf 0x200000a00000 len 4194304 PASSED
00:07:43.846 free 0x200000500000 3145728
00:07:43.846 free 0x2000004fff40 64
00:07:43.846 unregister 0x200000400000 4194304 PASSED
00:07:43.846 free 0x200000a00000 4194304
00:07:43.846 unregister 0x200000800000 6291456 PASSED
00:07:43.846 malloc 8388608
00:07:43.846 register 0x200000400000 10485760
00:07:43.846 buf 0x200000600000 len 8388608 PASSED
00:07:43.846 free 0x200000600000 8388608
00:07:43.846 unregister 0x200000400000 10485760 PASSED
00:07:43.846 passed
00:07:43.846
00:07:43.846 Run Summary: Type Total Ran Passed Failed Inactive
00:07:43.846 suites 1 1 n/a 0 0
00:07:43.846 tests 1 1 1 0 0
00:07:43.846 asserts 15 15 15 0 n/a
00:07:43.846
00:07:43.846 Elapsed time = 0.005 seconds
00:07:43.846
00:07:43.846 real 0m0.107s
00:07:43.846 user 0m0.024s
00:07:43.846 sys 0m0.082s
00:07:43.846 07:13:16 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:43.846 07:13:16 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x
00:07:43.846 ************************************
00:07:43.846 END TEST env_mem_callbacks
00:07:43.846 ************************************
00:07:44.105
00:07:44.105 real 0m7.536s
00:07:44.105 user 0m5.041s
00:07:44.105 sys 0m1.560s
00:07:44.105 07:13:16 env -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:44.105 07:13:16 env -- common/autotest_common.sh@10 -- # set +x
00:07:44.105 ************************************
00:07:44.105 END TEST env
00:07:44.105 ************************************
00:07:44.105 07:13:16 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:07:44.105 07:13:16 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:07:44.105 07:13:16 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:07:44.105 07:13:16 -- common/autotest_common.sh@10 -- # set +x
00:07:44.105 ************************************
00:07:44.105 START TEST rpc
00:07:44.105 ************************************
00:07:44.105 07:13:16 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:07:44.105 * Looking for test storage...
00:07:44.105 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:07:44.105 07:13:16 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1534376
00:07:44.105 07:13:16 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:07:44.105 07:13:16 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1534376
00:07:44.105 07:13:16 rpc -- common/autotest_common.sh@831 -- # '[' -z 1534376 ']'
00:07:44.105 07:13:16 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:44.105 07:13:16 rpc -- common/autotest_common.sh@836 -- # local max_retries=100
00:07:44.105 07:13:16 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:44.105 07:13:16 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:44.105 07:13:16 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:07:44.105 07:13:16 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.105 [2024-07-25 07:13:16.632174] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:07:44.105 [2024-07-25 07:13:16.632235] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1534376 ] 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.0 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.1 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.2 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.3 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.4 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.5 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.6 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:01.7 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.0 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.1 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.2 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.3 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.4 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.5 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.6 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3d:02.7 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.0 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.1 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.2 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.3 cannot be used 00:07:44.365 qat_pci_device_allocate(): 
Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.4 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.5 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.6 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:01.7 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:02.0 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:02.1 cannot be used 00:07:44.365 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.365 EAL: Requested device 0000:3f:02.2 cannot be used 00:07:44.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.366 EAL: Requested device 0000:3f:02.3 cannot be used 00:07:44.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.366 EAL: Requested device 0000:3f:02.4 cannot be used 00:07:44.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.366 EAL: Requested device 0000:3f:02.5 cannot be used 00:07:44.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.366 EAL: Requested device 0000:3f:02.6 cannot be used 00:07:44.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:44.366 EAL: Requested device 0000:3f:02.7 cannot be used 00:07:44.366 [2024-07-25 07:13:16.762952] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.366 [2024-07-25 07:13:16.847416] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:07:44.366 [2024-07-25 07:13:16.847461] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1534376' to capture a snapshot of events at runtime. 00:07:44.366 [2024-07-25 07:13:16.847475] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:44.366 [2024-07-25 07:13:16.847487] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:44.366 [2024-07-25 07:13:16.847496] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1534376 for offline analysis/debug. 
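The target was started with '-e bdev', so only the bdev tracepoint group is enabled, and the app_setup_trace NOTICE lines above spell out how to inspect it. A sketch that sticks to the command and the shared-memory path the log itself prints (binary locations assume the same build tree used elsewhere in this job):

    # Live snapshot of tracepoint events while the target is still running,
    # exactly as the NOTICE above suggests:
    ./build/bin/spdk_trace -s spdk_tgt -p 1534376

    # Or preserve the shared-memory trace file for offline analysis after the
    # target exits, as the last NOTICE recommends:
    cp /dev/shm/spdk_tgt_trace.pid1534376 /tmp/spdk_tgt_trace.pid1534376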
00:07:44.366 [2024-07-25 07:13:16.847525] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.935 07:13:17 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:44.935 07:13:17 rpc -- common/autotest_common.sh@864 -- # return 0 00:07:44.935 07:13:17 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:44.935 07:13:17 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:44.935 07:13:17 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:07:44.935 07:13:17 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:07:44.935 07:13:17 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:44.935 07:13:17 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:44.935 07:13:17 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.935 ************************************ 00:07:44.935 START TEST rpc_integrity 00:07:44.935 ************************************ 00:07:44.935 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:07:44.935 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:44.935 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:44.935 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:44.935 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:44.935 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:44.935 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:44.935 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:44.935 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:44.935 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:44.935 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:45.195 { 00:07:45.195 "name": "Malloc0", 00:07:45.195 "aliases": [ 00:07:45.195 "fdebcd26-0ebb-4fac-8494-9d06fcb1b488" 00:07:45.195 ], 00:07:45.195 "product_name": "Malloc disk", 00:07:45.195 "block_size": 512, 00:07:45.195 "num_blocks": 16384, 00:07:45.195 "uuid": "fdebcd26-0ebb-4fac-8494-9d06fcb1b488", 00:07:45.195 "assigned_rate_limits": { 00:07:45.195 "rw_ios_per_sec": 0, 00:07:45.195 "rw_mbytes_per_sec": 0, 00:07:45.195 "r_mbytes_per_sec": 0, 00:07:45.195 "w_mbytes_per_sec": 0 00:07:45.195 }, 00:07:45.195 
"claimed": false, 00:07:45.195 "zoned": false, 00:07:45.195 "supported_io_types": { 00:07:45.195 "read": true, 00:07:45.195 "write": true, 00:07:45.195 "unmap": true, 00:07:45.195 "flush": true, 00:07:45.195 "reset": true, 00:07:45.195 "nvme_admin": false, 00:07:45.195 "nvme_io": false, 00:07:45.195 "nvme_io_md": false, 00:07:45.195 "write_zeroes": true, 00:07:45.195 "zcopy": true, 00:07:45.195 "get_zone_info": false, 00:07:45.195 "zone_management": false, 00:07:45.195 "zone_append": false, 00:07:45.195 "compare": false, 00:07:45.195 "compare_and_write": false, 00:07:45.195 "abort": true, 00:07:45.195 "seek_hole": false, 00:07:45.195 "seek_data": false, 00:07:45.195 "copy": true, 00:07:45.195 "nvme_iov_md": false 00:07:45.195 }, 00:07:45.195 "memory_domains": [ 00:07:45.195 { 00:07:45.195 "dma_device_id": "system", 00:07:45.195 "dma_device_type": 1 00:07:45.195 }, 00:07:45.195 { 00:07:45.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:45.195 "dma_device_type": 2 00:07:45.195 } 00:07:45.195 ], 00:07:45.195 "driver_specific": {} 00:07:45.195 } 00:07:45.195 ]' 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.195 [2024-07-25 07:13:17.535275] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:07:45.195 [2024-07-25 07:13:17.535311] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:45.195 [2024-07-25 07:13:17.535329] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2557630 00:07:45.195 [2024-07-25 07:13:17.535341] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:45.195 [2024-07-25 07:13:17.536802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:45.195 [2024-07-25 07:13:17.536828] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:45.195 Passthru0 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:45.195 { 00:07:45.195 "name": "Malloc0", 00:07:45.195 "aliases": [ 00:07:45.195 "fdebcd26-0ebb-4fac-8494-9d06fcb1b488" 00:07:45.195 ], 00:07:45.195 "product_name": "Malloc disk", 00:07:45.195 "block_size": 512, 00:07:45.195 "num_blocks": 16384, 00:07:45.195 "uuid": "fdebcd26-0ebb-4fac-8494-9d06fcb1b488", 00:07:45.195 "assigned_rate_limits": { 00:07:45.195 "rw_ios_per_sec": 0, 00:07:45.195 "rw_mbytes_per_sec": 0, 00:07:45.195 "r_mbytes_per_sec": 0, 00:07:45.195 "w_mbytes_per_sec": 0 00:07:45.195 }, 00:07:45.195 "claimed": true, 00:07:45.195 "claim_type": "exclusive_write", 00:07:45.195 "zoned": false, 00:07:45.195 "supported_io_types": { 00:07:45.195 "read": true, 00:07:45.195 "write": true, 00:07:45.195 "unmap": true, 00:07:45.195 "flush": true, 
00:07:45.195 "reset": true, 00:07:45.195 "nvme_admin": false, 00:07:45.195 "nvme_io": false, 00:07:45.195 "nvme_io_md": false, 00:07:45.195 "write_zeroes": true, 00:07:45.195 "zcopy": true, 00:07:45.195 "get_zone_info": false, 00:07:45.195 "zone_management": false, 00:07:45.195 "zone_append": false, 00:07:45.195 "compare": false, 00:07:45.195 "compare_and_write": false, 00:07:45.195 "abort": true, 00:07:45.195 "seek_hole": false, 00:07:45.195 "seek_data": false, 00:07:45.195 "copy": true, 00:07:45.195 "nvme_iov_md": false 00:07:45.195 }, 00:07:45.195 "memory_domains": [ 00:07:45.195 { 00:07:45.195 "dma_device_id": "system", 00:07:45.195 "dma_device_type": 1 00:07:45.195 }, 00:07:45.195 { 00:07:45.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:45.195 "dma_device_type": 2 00:07:45.195 } 00:07:45.195 ], 00:07:45.195 "driver_specific": {} 00:07:45.195 }, 00:07:45.195 { 00:07:45.195 "name": "Passthru0", 00:07:45.195 "aliases": [ 00:07:45.195 "bcca1a25-81b7-51eb-9be4-4ab4842fa795" 00:07:45.195 ], 00:07:45.195 "product_name": "passthru", 00:07:45.195 "block_size": 512, 00:07:45.195 "num_blocks": 16384, 00:07:45.195 "uuid": "bcca1a25-81b7-51eb-9be4-4ab4842fa795", 00:07:45.195 "assigned_rate_limits": { 00:07:45.195 "rw_ios_per_sec": 0, 00:07:45.195 "rw_mbytes_per_sec": 0, 00:07:45.195 "r_mbytes_per_sec": 0, 00:07:45.195 "w_mbytes_per_sec": 0 00:07:45.195 }, 00:07:45.195 "claimed": false, 00:07:45.195 "zoned": false, 00:07:45.195 "supported_io_types": { 00:07:45.195 "read": true, 00:07:45.195 "write": true, 00:07:45.195 "unmap": true, 00:07:45.195 "flush": true, 00:07:45.195 "reset": true, 00:07:45.195 "nvme_admin": false, 00:07:45.195 "nvme_io": false, 00:07:45.195 "nvme_io_md": false, 00:07:45.195 "write_zeroes": true, 00:07:45.195 "zcopy": true, 00:07:45.195 "get_zone_info": false, 00:07:45.195 "zone_management": false, 00:07:45.195 "zone_append": false, 00:07:45.195 "compare": false, 00:07:45.195 "compare_and_write": false, 00:07:45.195 "abort": true, 00:07:45.195 "seek_hole": false, 00:07:45.195 "seek_data": false, 00:07:45.195 "copy": true, 00:07:45.195 "nvme_iov_md": false 00:07:45.195 }, 00:07:45.195 "memory_domains": [ 00:07:45.195 { 00:07:45.195 "dma_device_id": "system", 00:07:45.195 "dma_device_type": 1 00:07:45.195 }, 00:07:45.195 { 00:07:45.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:45.195 "dma_device_type": 2 00:07:45.195 } 00:07:45.195 ], 00:07:45.195 "driver_specific": { 00:07:45.195 "passthru": { 00:07:45.195 "name": "Passthru0", 00:07:45.195 "base_bdev_name": "Malloc0" 00:07:45.195 } 00:07:45.195 } 00:07:45.195 } 00:07:45.195 ]' 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.195 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.195 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.196 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:45.196 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:45.196 07:13:17 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:45.196 00:07:45.196 real 0m0.267s 00:07:45.196 user 0m0.169s 00:07:45.196 sys 0m0.039s 00:07:45.196 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:45.196 07:13:17 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.196 ************************************ 00:07:45.196 END TEST rpc_integrity 00:07:45.196 ************************************ 00:07:45.196 07:13:17 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:07:45.196 07:13:17 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:45.196 07:13:17 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:45.196 07:13:17 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 ************************************ 00:07:45.455 START TEST rpc_plugins 00:07:45.455 ************************************ 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:07:45.455 { 00:07:45.455 "name": "Malloc1", 00:07:45.455 "aliases": [ 00:07:45.455 "cba12302-d99a-4723-be31-ed4e19d65000" 00:07:45.455 ], 00:07:45.455 "product_name": "Malloc disk", 00:07:45.455 "block_size": 4096, 00:07:45.455 "num_blocks": 256, 00:07:45.455 "uuid": "cba12302-d99a-4723-be31-ed4e19d65000", 00:07:45.455 "assigned_rate_limits": { 00:07:45.455 "rw_ios_per_sec": 0, 00:07:45.455 "rw_mbytes_per_sec": 0, 00:07:45.455 "r_mbytes_per_sec": 0, 00:07:45.455 "w_mbytes_per_sec": 0 00:07:45.455 }, 00:07:45.455 "claimed": false, 00:07:45.455 "zoned": false, 00:07:45.455 "supported_io_types": { 00:07:45.455 "read": true, 00:07:45.455 "write": true, 00:07:45.455 "unmap": true, 00:07:45.455 "flush": true, 00:07:45.455 "reset": true, 00:07:45.455 "nvme_admin": false, 00:07:45.455 "nvme_io": false, 00:07:45.455 "nvme_io_md": false, 00:07:45.455 "write_zeroes": true, 00:07:45.455 "zcopy": true, 00:07:45.455 "get_zone_info": false, 00:07:45.455 "zone_management": false, 00:07:45.455 "zone_append": false, 00:07:45.455 "compare": false, 00:07:45.455 "compare_and_write": false, 00:07:45.455 "abort": true, 00:07:45.455 "seek_hole": false, 00:07:45.455 "seek_data": false, 00:07:45.455 "copy": true, 00:07:45.455 "nvme_iov_md": false 
00:07:45.455 }, 00:07:45.455 "memory_domains": [ 00:07:45.455 { 00:07:45.455 "dma_device_id": "system", 00:07:45.455 "dma_device_type": 1 00:07:45.455 }, 00:07:45.455 { 00:07:45.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:45.455 "dma_device_type": 2 00:07:45.455 } 00:07:45.455 ], 00:07:45.455 "driver_specific": {} 00:07:45.455 } 00:07:45.455 ]' 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:07:45.455 07:13:17 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:07:45.455 00:07:45.455 real 0m0.146s 00:07:45.455 user 0m0.090s 00:07:45.455 sys 0m0.022s 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:45.455 07:13:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 ************************************ 00:07:45.455 END TEST rpc_plugins 00:07:45.455 ************************************ 00:07:45.455 07:13:17 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:07:45.455 07:13:17 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:45.455 07:13:17 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:45.455 07:13:17 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 ************************************ 00:07:45.455 START TEST rpc_trace_cmd_test 00:07:45.455 ************************************ 00:07:45.455 07:13:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:07:45.455 07:13:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:07:45.455 07:13:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:07:45.455 07:13:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.455 07:13:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:07:45.715 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1534376", 00:07:45.715 "tpoint_group_mask": "0x8", 00:07:45.715 "iscsi_conn": { 00:07:45.715 "mask": "0x2", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "scsi": { 00:07:45.715 "mask": "0x4", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "bdev": { 00:07:45.715 "mask": "0x8", 00:07:45.715 "tpoint_mask": "0xffffffffffffffff" 00:07:45.715 }, 00:07:45.715 "nvmf_rdma": { 00:07:45.715 "mask": "0x10", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "nvmf_tcp": { 00:07:45.715 "mask": "0x20", 00:07:45.715 
"tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "ftl": { 00:07:45.715 "mask": "0x40", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "blobfs": { 00:07:45.715 "mask": "0x80", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "dsa": { 00:07:45.715 "mask": "0x200", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "thread": { 00:07:45.715 "mask": "0x400", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "nvme_pcie": { 00:07:45.715 "mask": "0x800", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "iaa": { 00:07:45.715 "mask": "0x1000", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "nvme_tcp": { 00:07:45.715 "mask": "0x2000", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "bdev_nvme": { 00:07:45.715 "mask": "0x4000", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 }, 00:07:45.715 "sock": { 00:07:45.715 "mask": "0x8000", 00:07:45.715 "tpoint_mask": "0x0" 00:07:45.715 } 00:07:45.715 }' 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:07:45.715 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:07:45.975 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:07:45.975 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:07:45.975 07:13:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:07:45.975 00:07:45.975 real 0m0.312s 00:07:45.975 user 0m0.272s 00:07:45.975 sys 0m0.031s 00:07:45.975 07:13:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:45.975 07:13:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:45.975 ************************************ 00:07:45.975 END TEST rpc_trace_cmd_test 00:07:45.975 ************************************ 00:07:45.975 07:13:18 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:07:45.975 07:13:18 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:07:45.975 07:13:18 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:07:45.975 07:13:18 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:45.975 07:13:18 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:45.975 07:13:18 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:45.975 ************************************ 00:07:45.975 START TEST rpc_daemon_integrity 00:07:45.975 ************************************ 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:45.975 
07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:45.975 { 00:07:45.975 "name": "Malloc2", 00:07:45.975 "aliases": [ 00:07:45.975 "54848cca-b033-4604-ad5d-c26d877af1ed" 00:07:45.975 ], 00:07:45.975 "product_name": "Malloc disk", 00:07:45.975 "block_size": 512, 00:07:45.975 "num_blocks": 16384, 00:07:45.975 "uuid": "54848cca-b033-4604-ad5d-c26d877af1ed", 00:07:45.975 "assigned_rate_limits": { 00:07:45.975 "rw_ios_per_sec": 0, 00:07:45.975 "rw_mbytes_per_sec": 0, 00:07:45.975 "r_mbytes_per_sec": 0, 00:07:45.975 "w_mbytes_per_sec": 0 00:07:45.975 }, 00:07:45.975 "claimed": false, 00:07:45.975 "zoned": false, 00:07:45.975 "supported_io_types": { 00:07:45.975 "read": true, 00:07:45.975 "write": true, 00:07:45.975 "unmap": true, 00:07:45.975 "flush": true, 00:07:45.975 "reset": true, 00:07:45.975 "nvme_admin": false, 00:07:45.975 "nvme_io": false, 00:07:45.975 "nvme_io_md": false, 00:07:45.975 "write_zeroes": true, 00:07:45.975 "zcopy": true, 00:07:45.975 "get_zone_info": false, 00:07:45.975 "zone_management": false, 00:07:45.975 "zone_append": false, 00:07:45.975 "compare": false, 00:07:45.975 "compare_and_write": false, 00:07:45.975 "abort": true, 00:07:45.975 "seek_hole": false, 00:07:45.975 "seek_data": false, 00:07:45.975 "copy": true, 00:07:45.975 "nvme_iov_md": false 00:07:45.975 }, 00:07:45.975 "memory_domains": [ 00:07:45.975 { 00:07:45.975 "dma_device_id": "system", 00:07:45.975 "dma_device_type": 1 00:07:45.975 }, 00:07:45.975 { 00:07:45.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:45.975 "dma_device_type": 2 00:07:45.975 } 00:07:45.975 ], 00:07:45.975 "driver_specific": {} 00:07:45.975 } 00:07:45.975 ]' 00:07:45.975 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:46.235 [2024-07-25 07:13:18.518051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:07:46.235 [2024-07-25 07:13:18.518085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:46.235 [2024-07-25 07:13:18.518102] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2702d60 00:07:46.235 [2024-07-25 07:13:18.518113] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:07:46.235 [2024-07-25 07:13:18.519364] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:46.235 [2024-07-25 07:13:18.519391] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:46.235 Passthru0 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.235 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:46.235 { 00:07:46.235 "name": "Malloc2", 00:07:46.235 "aliases": [ 00:07:46.235 "54848cca-b033-4604-ad5d-c26d877af1ed" 00:07:46.235 ], 00:07:46.235 "product_name": "Malloc disk", 00:07:46.235 "block_size": 512, 00:07:46.235 "num_blocks": 16384, 00:07:46.235 "uuid": "54848cca-b033-4604-ad5d-c26d877af1ed", 00:07:46.235 "assigned_rate_limits": { 00:07:46.235 "rw_ios_per_sec": 0, 00:07:46.235 "rw_mbytes_per_sec": 0, 00:07:46.235 "r_mbytes_per_sec": 0, 00:07:46.235 "w_mbytes_per_sec": 0 00:07:46.235 }, 00:07:46.235 "claimed": true, 00:07:46.235 "claim_type": "exclusive_write", 00:07:46.235 "zoned": false, 00:07:46.235 "supported_io_types": { 00:07:46.235 "read": true, 00:07:46.235 "write": true, 00:07:46.235 "unmap": true, 00:07:46.235 "flush": true, 00:07:46.235 "reset": true, 00:07:46.235 "nvme_admin": false, 00:07:46.235 "nvme_io": false, 00:07:46.235 "nvme_io_md": false, 00:07:46.235 "write_zeroes": true, 00:07:46.235 "zcopy": true, 00:07:46.235 "get_zone_info": false, 00:07:46.235 "zone_management": false, 00:07:46.235 "zone_append": false, 00:07:46.235 "compare": false, 00:07:46.235 "compare_and_write": false, 00:07:46.235 "abort": true, 00:07:46.235 "seek_hole": false, 00:07:46.235 "seek_data": false, 00:07:46.235 "copy": true, 00:07:46.235 "nvme_iov_md": false 00:07:46.235 }, 00:07:46.235 "memory_domains": [ 00:07:46.235 { 00:07:46.235 "dma_device_id": "system", 00:07:46.235 "dma_device_type": 1 00:07:46.235 }, 00:07:46.235 { 00:07:46.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:46.235 "dma_device_type": 2 00:07:46.235 } 00:07:46.235 ], 00:07:46.235 "driver_specific": {} 00:07:46.235 }, 00:07:46.235 { 00:07:46.235 "name": "Passthru0", 00:07:46.235 "aliases": [ 00:07:46.235 "ec23b919-b295-5cab-90c6-db278a0fc184" 00:07:46.235 ], 00:07:46.235 "product_name": "passthru", 00:07:46.236 "block_size": 512, 00:07:46.236 "num_blocks": 16384, 00:07:46.236 "uuid": "ec23b919-b295-5cab-90c6-db278a0fc184", 00:07:46.236 "assigned_rate_limits": { 00:07:46.236 "rw_ios_per_sec": 0, 00:07:46.236 "rw_mbytes_per_sec": 0, 00:07:46.236 "r_mbytes_per_sec": 0, 00:07:46.236 "w_mbytes_per_sec": 0 00:07:46.236 }, 00:07:46.236 "claimed": false, 00:07:46.236 "zoned": false, 00:07:46.236 "supported_io_types": { 00:07:46.236 "read": true, 00:07:46.236 "write": true, 00:07:46.236 "unmap": true, 00:07:46.236 "flush": true, 00:07:46.236 "reset": true, 00:07:46.236 "nvme_admin": false, 00:07:46.236 "nvme_io": false, 00:07:46.236 "nvme_io_md": false, 00:07:46.236 "write_zeroes": true, 00:07:46.236 "zcopy": true, 00:07:46.236 "get_zone_info": false, 00:07:46.236 "zone_management": false, 00:07:46.236 "zone_append": false, 00:07:46.236 "compare": false, 00:07:46.236 "compare_and_write": false, 
00:07:46.236 "abort": true, 00:07:46.236 "seek_hole": false, 00:07:46.236 "seek_data": false, 00:07:46.236 "copy": true, 00:07:46.236 "nvme_iov_md": false 00:07:46.236 }, 00:07:46.236 "memory_domains": [ 00:07:46.236 { 00:07:46.236 "dma_device_id": "system", 00:07:46.236 "dma_device_type": 1 00:07:46.236 }, 00:07:46.236 { 00:07:46.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:46.236 "dma_device_type": 2 00:07:46.236 } 00:07:46.236 ], 00:07:46.236 "driver_specific": { 00:07:46.236 "passthru": { 00:07:46.236 "name": "Passthru0", 00:07:46.236 "base_bdev_name": "Malloc2" 00:07:46.236 } 00:07:46.236 } 00:07:46.236 } 00:07:46.236 ]' 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:46.236 00:07:46.236 real 0m0.321s 00:07:46.236 user 0m0.204s 00:07:46.236 sys 0m0.052s 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:46.236 07:13:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:46.236 ************************************ 00:07:46.236 END TEST rpc_daemon_integrity 00:07:46.236 ************************************ 00:07:46.236 07:13:18 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:07:46.236 07:13:18 rpc -- rpc/rpc.sh@84 -- # killprocess 1534376 00:07:46.236 07:13:18 rpc -- common/autotest_common.sh@950 -- # '[' -z 1534376 ']' 00:07:46.236 07:13:18 rpc -- common/autotest_common.sh@954 -- # kill -0 1534376 00:07:46.236 07:13:18 rpc -- common/autotest_common.sh@955 -- # uname 00:07:46.236 07:13:18 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:46.236 07:13:18 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1534376 00:07:46.495 07:13:18 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:46.495 07:13:18 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:46.496 07:13:18 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1534376' 00:07:46.496 killing process with pid 1534376 00:07:46.496 07:13:18 
rpc -- common/autotest_common.sh@969 -- # kill 1534376 00:07:46.496 07:13:18 rpc -- common/autotest_common.sh@974 -- # wait 1534376 00:07:46.756 00:07:46.756 real 0m2.641s 00:07:46.756 user 0m3.330s 00:07:46.756 sys 0m0.837s 00:07:46.756 07:13:19 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:46.756 07:13:19 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:46.756 ************************************ 00:07:46.756 END TEST rpc 00:07:46.756 ************************************ 00:07:46.756 07:13:19 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:46.756 07:13:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:46.756 07:13:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:46.756 07:13:19 -- common/autotest_common.sh@10 -- # set +x 00:07:46.756 ************************************ 00:07:46.756 START TEST skip_rpc 00:07:46.756 ************************************ 00:07:46.756 07:13:19 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:46.756 * Looking for test storage... 00:07:47.016 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:47.016 07:13:19 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:47.016 07:13:19 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:47.016 07:13:19 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:07:47.016 07:13:19 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:47.016 07:13:19 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:47.016 07:13:19 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:47.016 ************************************ 00:07:47.016 START TEST skip_rpc 00:07:47.016 ************************************ 00:07:47.016 07:13:19 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:07:47.016 07:13:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1535067 00:07:47.016 07:13:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:47.016 07:13:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:07:47.016 07:13:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:07:47.016 [2024-07-25 07:13:19.396774] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:07:47.016 [2024-07-25 07:13:19.396829] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1535067 ] 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.0 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.1 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.2 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.3 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.4 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.5 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.6 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:01.7 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.0 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.1 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.2 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.3 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.4 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.5 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.6 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3d:02.7 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.0 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.1 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.2 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.3 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.4 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.5 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.6 cannot be used 00:07:47.016 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:01.7 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:02.0 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:02.1 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:02.2 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:02.3 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:02.4 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:02.5 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.016 EAL: Requested device 0000:3f:02.6 cannot be used 00:07:47.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:47.017 EAL: Requested device 0000:3f:02.7 cannot be used 00:07:47.017 [2024-07-25 07:13:19.529633] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.276 [2024-07-25 07:13:19.613007] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1535067 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 1535067 ']' 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 1535067 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o 
comm= 1535067 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1535067' 00:07:52.574 killing process with pid 1535067 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 1535067 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 1535067 00:07:52.574 00:07:52.574 real 0m5.428s 00:07:52.574 user 0m5.107s 00:07:52.574 sys 0m0.374s 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:52.574 07:13:24 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.574 ************************************ 00:07:52.574 END TEST skip_rpc 00:07:52.574 ************************************ 00:07:52.574 07:13:24 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:52.574 07:13:24 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:52.574 07:13:24 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:52.574 07:13:24 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.574 ************************************ 00:07:52.574 START TEST skip_rpc_with_json 00:07:52.574 ************************************ 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1536049 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1536049 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 1536049 ']' 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:52.574 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:52.574 07:13:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:52.574 [2024-07-25 07:13:24.901300] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:07:52.574 [2024-07-25 07:13:24.901355] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1536049 ] 00:07:52.574 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.574 EAL: Requested device 0000:3d:01.0 cannot be used 00:07:52.574 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.574 EAL: Requested device 0000:3d:01.1 cannot be used 00:07:52.574 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.574 EAL: Requested device 0000:3d:01.2 cannot be used 00:07:52.574 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.574 EAL: Requested device 0000:3d:01.3 cannot be used 00:07:52.574 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.574 EAL: Requested device 0000:3d:01.4 cannot be used 00:07:52.574 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.574 EAL: Requested device 0000:3d:01.5 cannot be used 00:07:52.574 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.574 EAL: Requested device 0000:3d:01.6 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:01.7 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.0 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.1 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.2 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.3 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.4 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.5 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.6 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3d:02.7 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.0 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.1 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.2 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.3 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.4 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.5 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.6 cannot be used 00:07:52.575 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:01.7 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.0 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.1 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.2 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.3 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.4 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.5 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.6 cannot be used 00:07:52.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:52.575 EAL: Requested device 0000:3f:02.7 cannot be used 00:07:52.575 [2024-07-25 07:13:25.034094] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.834 [2024-07-25 07:13:25.121427] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:53.403 [2024-07-25 07:13:25.798475] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:53.403 request: 00:07:53.403 { 00:07:53.403 "trtype": "tcp", 00:07:53.403 "method": "nvmf_get_transports", 00:07:53.403 "req_id": 1 00:07:53.403 } 00:07:53.403 Got JSON-RPC error response 00:07:53.403 response: 00:07:53.403 { 00:07:53.403 "code": -19, 00:07:53.403 "message": "No such device" 00:07:53.403 } 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:53.403 [2024-07-25 07:13:25.810619] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:53.403 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:53.663 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:53.663 07:13:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:53.663 { 00:07:53.663 "subsystems": [ 00:07:53.663 { 00:07:53.663 "subsystem": "keyring", 00:07:53.663 "config": [] 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "subsystem": "iobuf", 00:07:53.663 "config": [ 00:07:53.663 { 00:07:53.663 "method": "iobuf_set_options", 00:07:53.663 "params": { 00:07:53.663 "small_pool_count": 8192, 00:07:53.663 "large_pool_count": 1024, 00:07:53.663 "small_bufsize": 8192, 00:07:53.663 "large_bufsize": 135168 00:07:53.663 } 00:07:53.663 } 00:07:53.663 ] 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "subsystem": "sock", 00:07:53.663 "config": [ 00:07:53.663 { 00:07:53.663 "method": "sock_set_default_impl", 00:07:53.663 "params": { 00:07:53.663 "impl_name": "posix" 00:07:53.663 } 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "method": "sock_impl_set_options", 00:07:53.663 "params": { 00:07:53.663 "impl_name": "ssl", 00:07:53.663 "recv_buf_size": 4096, 00:07:53.663 "send_buf_size": 4096, 00:07:53.663 "enable_recv_pipe": true, 00:07:53.663 "enable_quickack": false, 00:07:53.663 "enable_placement_id": 0, 00:07:53.663 "enable_zerocopy_send_server": true, 00:07:53.663 "enable_zerocopy_send_client": false, 00:07:53.663 "zerocopy_threshold": 0, 00:07:53.663 "tls_version": 0, 00:07:53.663 "enable_ktls": false 00:07:53.663 } 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "method": "sock_impl_set_options", 00:07:53.663 "params": { 00:07:53.663 "impl_name": "posix", 00:07:53.663 "recv_buf_size": 2097152, 00:07:53.663 "send_buf_size": 2097152, 00:07:53.663 "enable_recv_pipe": true, 00:07:53.663 "enable_quickack": false, 00:07:53.663 "enable_placement_id": 0, 00:07:53.663 "enable_zerocopy_send_server": true, 00:07:53.663 "enable_zerocopy_send_client": false, 00:07:53.663 "zerocopy_threshold": 0, 00:07:53.663 "tls_version": 0, 00:07:53.663 "enable_ktls": false 00:07:53.663 } 00:07:53.663 } 00:07:53.663 ] 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "subsystem": "vmd", 00:07:53.663 "config": [] 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "subsystem": "accel", 00:07:53.663 "config": [ 00:07:53.663 { 00:07:53.663 "method": "accel_set_options", 00:07:53.663 "params": { 00:07:53.663 "small_cache_size": 128, 00:07:53.663 "large_cache_size": 16, 00:07:53.663 "task_count": 2048, 00:07:53.663 "sequence_count": 2048, 00:07:53.663 "buf_count": 2048 00:07:53.663 } 00:07:53.663 } 00:07:53.663 ] 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "subsystem": "bdev", 00:07:53.663 "config": [ 00:07:53.663 { 00:07:53.663 "method": "bdev_set_options", 00:07:53.663 "params": { 00:07:53.663 "bdev_io_pool_size": 65535, 00:07:53.663 "bdev_io_cache_size": 256, 00:07:53.663 "bdev_auto_examine": true, 00:07:53.663 "iobuf_small_cache_size": 128, 00:07:53.663 "iobuf_large_cache_size": 16 00:07:53.663 } 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "method": "bdev_raid_set_options", 00:07:53.663 "params": { 00:07:53.663 "process_window_size_kb": 1024, 00:07:53.663 "process_max_bandwidth_mb_sec": 0 00:07:53.663 } 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "method": "bdev_iscsi_set_options", 00:07:53.663 "params": { 00:07:53.663 "timeout_sec": 30 00:07:53.663 } 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "method": "bdev_nvme_set_options", 00:07:53.663 "params": { 00:07:53.663 "action_on_timeout": "none", 00:07:53.663 "timeout_us": 0, 00:07:53.663 "timeout_admin_us": 0, 00:07:53.663 "keep_alive_timeout_ms": 10000, 00:07:53.663 "arbitration_burst": 0, 00:07:53.663 "low_priority_weight": 0, 00:07:53.663 "medium_priority_weight": 0, 00:07:53.663 
"high_priority_weight": 0, 00:07:53.663 "nvme_adminq_poll_period_us": 10000, 00:07:53.663 "nvme_ioq_poll_period_us": 0, 00:07:53.663 "io_queue_requests": 0, 00:07:53.663 "delay_cmd_submit": true, 00:07:53.663 "transport_retry_count": 4, 00:07:53.663 "bdev_retry_count": 3, 00:07:53.663 "transport_ack_timeout": 0, 00:07:53.663 "ctrlr_loss_timeout_sec": 0, 00:07:53.663 "reconnect_delay_sec": 0, 00:07:53.663 "fast_io_fail_timeout_sec": 0, 00:07:53.663 "disable_auto_failback": false, 00:07:53.663 "generate_uuids": false, 00:07:53.663 "transport_tos": 0, 00:07:53.663 "nvme_error_stat": false, 00:07:53.663 "rdma_srq_size": 0, 00:07:53.663 "io_path_stat": false, 00:07:53.663 "allow_accel_sequence": false, 00:07:53.663 "rdma_max_cq_size": 0, 00:07:53.663 "rdma_cm_event_timeout_ms": 0, 00:07:53.663 "dhchap_digests": [ 00:07:53.663 "sha256", 00:07:53.663 "sha384", 00:07:53.663 "sha512" 00:07:53.663 ], 00:07:53.663 "dhchap_dhgroups": [ 00:07:53.663 "null", 00:07:53.663 "ffdhe2048", 00:07:53.663 "ffdhe3072", 00:07:53.663 "ffdhe4096", 00:07:53.663 "ffdhe6144", 00:07:53.663 "ffdhe8192" 00:07:53.663 ] 00:07:53.663 } 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "method": "bdev_nvme_set_hotplug", 00:07:53.663 "params": { 00:07:53.663 "period_us": 100000, 00:07:53.663 "enable": false 00:07:53.663 } 00:07:53.663 }, 00:07:53.663 { 00:07:53.663 "method": "bdev_wait_for_examine" 00:07:53.663 } 00:07:53.663 ] 00:07:53.663 }, 00:07:53.664 { 00:07:53.664 "subsystem": "scsi", 00:07:53.664 "config": null 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "subsystem": "scheduler", 00:07:53.664 "config": [ 00:07:53.664 { 00:07:53.664 "method": "framework_set_scheduler", 00:07:53.664 "params": { 00:07:53.664 "name": "static" 00:07:53.664 } 00:07:53.664 } 00:07:53.664 ] 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "subsystem": "vhost_scsi", 00:07:53.664 "config": [] 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "subsystem": "vhost_blk", 00:07:53.664 "config": [] 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "subsystem": "ublk", 00:07:53.664 "config": [] 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "subsystem": "nbd", 00:07:53.664 "config": [] 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "subsystem": "nvmf", 00:07:53.664 "config": [ 00:07:53.664 { 00:07:53.664 "method": "nvmf_set_config", 00:07:53.664 "params": { 00:07:53.664 "discovery_filter": "match_any", 00:07:53.664 "admin_cmd_passthru": { 00:07:53.664 "identify_ctrlr": false 00:07:53.664 } 00:07:53.664 } 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "method": "nvmf_set_max_subsystems", 00:07:53.664 "params": { 00:07:53.664 "max_subsystems": 1024 00:07:53.664 } 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "method": "nvmf_set_crdt", 00:07:53.664 "params": { 00:07:53.664 "crdt1": 0, 00:07:53.664 "crdt2": 0, 00:07:53.664 "crdt3": 0 00:07:53.664 } 00:07:53.664 }, 00:07:53.664 { 00:07:53.664 "method": "nvmf_create_transport", 00:07:53.664 "params": { 00:07:53.664 "trtype": "TCP", 00:07:53.664 "max_queue_depth": 128, 00:07:53.664 "max_io_qpairs_per_ctrlr": 127, 00:07:53.664 "in_capsule_data_size": 4096, 00:07:53.664 "max_io_size": 131072, 00:07:53.664 "io_unit_size": 131072, 00:07:53.664 "max_aq_depth": 128, 00:07:53.664 "num_shared_buffers": 511, 00:07:53.664 "buf_cache_size": 4294967295, 00:07:53.664 "dif_insert_or_strip": false, 00:07:53.664 "zcopy": false, 00:07:53.664 "c2h_success": true, 00:07:53.664 "sock_priority": 0, 00:07:53.664 "abort_timeout_sec": 1, 00:07:53.664 "ack_timeout": 0, 00:07:53.664 "data_wr_pool_size": 0 00:07:53.664 } 00:07:53.664 } 00:07:53.664 ] 00:07:53.664 }, 
00:07:53.664 { 00:07:53.664 "subsystem": "iscsi", 00:07:53.664 "config": [ 00:07:53.664 { 00:07:53.664 "method": "iscsi_set_options", 00:07:53.664 "params": { 00:07:53.664 "node_base": "iqn.2016-06.io.spdk", 00:07:53.664 "max_sessions": 128, 00:07:53.664 "max_connections_per_session": 2, 00:07:53.664 "max_queue_depth": 64, 00:07:53.664 "default_time2wait": 2, 00:07:53.664 "default_time2retain": 20, 00:07:53.664 "first_burst_length": 8192, 00:07:53.664 "immediate_data": true, 00:07:53.664 "allow_duplicated_isid": false, 00:07:53.664 "error_recovery_level": 0, 00:07:53.664 "nop_timeout": 60, 00:07:53.664 "nop_in_interval": 30, 00:07:53.664 "disable_chap": false, 00:07:53.664 "require_chap": false, 00:07:53.664 "mutual_chap": false, 00:07:53.664 "chap_group": 0, 00:07:53.664 "max_large_datain_per_connection": 64, 00:07:53.664 "max_r2t_per_connection": 4, 00:07:53.664 "pdu_pool_size": 36864, 00:07:53.664 "immediate_data_pool_size": 16384, 00:07:53.664 "data_out_pool_size": 2048 00:07:53.664 } 00:07:53.664 } 00:07:53.664 ] 00:07:53.664 } 00:07:53.664 ] 00:07:53.664 } 00:07:53.664 07:13:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:07:53.664 07:13:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1536049 00:07:53.664 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1536049 ']' 00:07:53.664 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1536049 00:07:53.664 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:07:53.664 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:53.664 07:13:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1536049 00:07:53.664 07:13:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:53.664 07:13:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:53.664 07:13:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1536049' 00:07:53.664 killing process with pid 1536049 00:07:53.664 07:13:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1536049 00:07:53.664 07:13:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1536049 00:07:53.923 07:13:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1536309 00:07:53.923 07:13:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:07:53.923 07:13:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1536309 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1536309 ']' 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1536309 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1536309 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- 
common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1536309' 00:07:59.195 killing process with pid 1536309 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1536309 00:07:59.195 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1536309 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:59.454 00:07:59.454 real 0m6.959s 00:07:59.454 user 0m6.740s 00:07:59.454 sys 0m0.785s 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:59.454 ************************************ 00:07:59.454 END TEST skip_rpc_with_json 00:07:59.454 ************************************ 00:07:59.454 07:13:31 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:59.454 07:13:31 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:59.454 07:13:31 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:59.454 07:13:31 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:59.454 ************************************ 00:07:59.454 START TEST skip_rpc_with_delay 00:07:59.454 ************************************ 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:59.454 [2024-07-25 07:13:31.944557] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:07:59.454 [2024-07-25 07:13:31.944647] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:59.454 00:07:59.454 real 0m0.091s 00:07:59.454 user 0m0.050s 00:07:59.454 sys 0m0.040s 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:59.454 07:13:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:59.454 ************************************ 00:07:59.454 END TEST skip_rpc_with_delay 00:07:59.454 ************************************ 00:07:59.713 07:13:32 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:59.713 07:13:32 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:59.713 07:13:32 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:59.713 07:13:32 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:59.713 07:13:32 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:59.713 07:13:32 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:59.713 ************************************ 00:07:59.713 START TEST exit_on_failed_rpc_init 00:07:59.713 ************************************ 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1537278 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1537278 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 1537278 ']' 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:59.713 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:59.713 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:59.713 [2024-07-25 07:13:32.098298] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
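Note: the exit_on_failed_rpc_init test starting here brings up one spdk_tgt on the default RPC socket /var/tmp/spdk.sock and then launches a second instance against the same socket; that second launch is expected to fail, and the test's NOT wrapper turns the non-zero exit into a pass. The same wrapper was used just above to check that --wait-for-rpc is rejected when the RPC server is disabled. A minimal sketch of that failure-assertion pattern follows; the binary path is taken from the trace, everything else is illustrative:
    # Sketch of the failure-assertion pattern used by skip_rpc_with_delay and
    # exit_on_failed_rpc_init (binary path from the trace; log handling omitted).
    SPDK_TGT=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt

    # --wait-for-rpc contradicts --no-rpc-server, so this invocation exits
    # immediately with an error; the check passes only if the exit code is non-zero.
    if "$SPDK_TGT" --no-rpc-server -m 0x1 --wait-for-rpc; then
        echo "unexpected success: spdk_tgt accepted --wait-for-rpc without an RPC server" >&2
        exit 1
    fi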
00:07:59.713 [2024-07-25 07:13:32.098354] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1537278 ]
00:07:59.713 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:07:59.713 EAL: Requested device 0000:3d:01.0 cannot be used [this message pair repeats for all 32 QAT virtual functions 0000:3d:01.0-0000:3d:02.7 and 0000:3f:01.0-0000:3f:02.7]
00:07:59.714 [2024-07-25 07:13:32.217737] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.973 [2024-07-25 07:13:32.304981] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0
00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]]
00:08:00.540 07:13:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:08:00.540 [2024-07-25 07:13:33.029912] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:08:00.540 [2024-07-25 07:13:33.029962] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1537538 ]
00:08:00.799 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:00.799 EAL: Requested device 0000:3d:01.0 cannot be used [this message pair repeats for all 32 QAT virtual functions 0000:3d:01.0-0000:3d:02.7 and 0000:3f:01.0-0000:3f:02.7]
00:08:00.800 [2024-07-25 07:13:33.135273] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.800 [2024-07-25 07:13:33.216345] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:00.800 [2024-07-25 07:13:33.216438] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another.
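Note: the errors above are the expected outcome of this negative test, not a real failure: pid 1537278 already owns /var/tmp/spdk.sock, so the second spdk_tgt (core mask 0x2) cannot bind the RPC listen socket and shuts down with a non-zero status. Outside of this test, two targets can coexist by giving the second instance its own RPC socket with -r, as the json_config test further down does; a small sketch under that assumption (socket path and core masks are illustrative, hugepage setup is assumed to be done already):
    # Run two SPDK targets side by side by separating their RPC sockets.
    SPDK_TGT=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
    "$SPDK_TGT" -m 0x1 &                               # first target on the default /var/tmp/spdk.sock
    "$SPDK_TGT" -m 0x2 -r /var/tmp/spdk_second.sock &  # second target on its own listen socket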
00:08:00.800 [2024-07-25 07:13:33.216454] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:08:00.800 [2024-07-25 07:13:33.216465] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1537278 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 1537278 ']' 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 1537278 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:00.800 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1537278 00:08:01.058 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:01.058 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:01.058 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1537278' 00:08:01.058 killing process with pid 1537278 00:08:01.058 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 1537278 00:08:01.058 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 1537278 00:08:01.317 00:08:01.317 real 0m1.655s 00:08:01.317 user 0m1.905s 00:08:01.317 sys 0m0.528s 00:08:01.317 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:01.317 07:13:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:08:01.317 ************************************ 00:08:01.317 END TEST exit_on_failed_rpc_init 00:08:01.317 ************************************ 00:08:01.317 07:13:33 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:08:01.317 00:08:01.317 real 0m14.541s 00:08:01.317 user 0m13.963s 00:08:01.317 sys 0m2.007s 00:08:01.317 07:13:33 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:01.317 07:13:33 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:01.317 ************************************ 00:08:01.317 END TEST skip_rpc 00:08:01.317 ************************************ 00:08:01.317 07:13:33 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:08:01.317 07:13:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:01.317 07:13:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:01.318 07:13:33 -- 
common/autotest_common.sh@10 -- # set +x 00:08:01.318 ************************************ 00:08:01.318 START TEST rpc_client 00:08:01.318 ************************************ 00:08:01.318 07:13:33 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:08:01.577 * Looking for test storage... 00:08:01.577 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:08:01.577 07:13:33 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:08:01.577 OK 00:08:01.577 07:13:33 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:08:01.577 00:08:01.577 real 0m0.133s 00:08:01.577 user 0m0.062s 00:08:01.577 sys 0m0.082s 00:08:01.577 07:13:33 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:01.577 07:13:33 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:08:01.577 ************************************ 00:08:01.577 END TEST rpc_client 00:08:01.577 ************************************ 00:08:01.577 07:13:33 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:08:01.577 07:13:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:01.577 07:13:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:01.577 07:13:33 -- common/autotest_common.sh@10 -- # set +x 00:08:01.577 ************************************ 00:08:01.577 START TEST json_config 00:08:01.577 ************************************ 00:08:01.577 07:13:34 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:08:01.577 07:13:34 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:08:01.577 07:13:34 json_config -- nvmf/common.sh@7 -- # uname -s 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:01.836 07:13:34 json_config -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:08:01.836 07:13:34 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:01.836 07:13:34 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:01.836 07:13:34 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:01.836 07:13:34 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:01.837 07:13:34 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:01.837 07:13:34 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:01.837 07:13:34 json_config -- paths/export.sh@5 -- # export PATH 00:08:01.837 07:13:34 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@47 -- # : 0 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:08:01.837 07:13:34 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST 
+ SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@359 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:08:01.837 INFO: JSON configuration test init 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:01.837 07:13:34 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:08:01.837 07:13:34 json_config -- json_config/common.sh@9 -- # local app=target 00:08:01.837 07:13:34 json_config -- json_config/common.sh@10 -- # shift 00:08:01.837 07:13:34 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:01.837 07:13:34 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:01.837 07:13:34 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:08:01.837 07:13:34 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:01.837 07:13:34 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:01.837 07:13:34 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1537826 00:08:01.837 07:13:34 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:01.837 Waiting for target to run... 
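Note: the json_config test starts the target with --wait-for-rpc so the accel framework can be configured over JSON-RPC before subsystem initialization; the trace that follows scans the dpdk_cryptodev module, assigns the encrypt/decrypt operations to it, and then loads a configuration generated by gen_nvme.sh. Condensed into the equivalent RPC sequence (rpc.py path and socket are taken from the trace; every command below appears verbatim further down):
    # RPC sequence driven by the json_config test right after the target starts.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock "$@"; }

    rpc dpdk_cryptodev_scan_accel_module                 # register the DPDK/QAT crypto accel module
    rpc accel_assign_opc -o encrypt -m dpdk_cryptodev    # route encrypt operations to it
    rpc accel_assign_opc -o decrypt -m dpdk_cryptodev    # route decrypt operations to it
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems | rpc load_config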
00:08:01.837 07:13:34 json_config -- json_config/common.sh@25 -- # waitforlisten 1537826 /var/tmp/spdk_tgt.sock 00:08:01.837 07:13:34 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc
00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@831 -- # '[' -z 1537826 ']' 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 07:13:34 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:01.837 07:13:34 json_config -- common/autotest_common.sh@10 -- # set +x
00:08:01.837 [2024-07-25 07:13:34.224291] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:08:01.837 [2024-07-25 07:13:34.224354] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1537826 ]
00:08:02.096 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:02.096 EAL: Requested device 0000:3d:01.0 cannot be used [this message pair repeats for all 32 QAT virtual functions 0000:3d:01.0-0000:3d:02.7 and 0000:3f:01.0-0000:3f:02.7]
00:08:02.097 [2024-07-25 07:13:34.584413] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.355 [2024-07-25 07:13:34.661580] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0
00:08:02.614 07:13:35 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:02.614 07:13:35 json_config -- common/autotest_common.sh@864 -- # return 0 00:08:02.614 07:13:35 json_config -- json_config/common.sh@26 -- # echo '' 00:08:02.614 00:08:02.614 07:13:35 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:08:02.614 07:13:35 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:08:02.614 07:13:35 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:02.614 07:13:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:02.614 07:13:35 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:08:02.614 07:13:35 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:08:02.614 07:13:35 json_config -- json_config/common.sh@57 -- #
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:08:02.873 07:13:35 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:08:02.873 07:13:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:08:03.131 [2024-07-25 07:13:35.552231] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:08:03.131 07:13:35 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:08:03.131 07:13:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:08:03.390 [2024-07-25 07:13:35.780816] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:08:03.390 07:13:35 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:08:03.390 07:13:35 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:03.390 07:13:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.390 07:13:35 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:08:03.390 07:13:35 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:08:03.390 07:13:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:08:03.648 [2024-07-25 07:13:36.074216] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:08:08.916 07:13:41 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:08:08.916 07:13:41 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:08:08.917 07:13:41 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:08.917 07:13:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:08:08.917 07:13:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@48 -- # local get_types 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@51 -- # sort 00:08:08.917 
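Note: the sort step above and the uniq -u step below compute the difference between the notification types the target advertises and the ones the test expects (bdev_register/bdev_unregister); an empty diff means the check passes. The trace then goes on to build the bdev subsystem configuration, creating bdevs of several types (split, malloc, passthru, null, aio, lvol, crypto). The same sequence, condensed into a sketch (every call is taken verbatim from the trace below; the crypto key is the test's fixed example key):
    # Bdev construction performed by the json_config test, as shown in the trace below.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock "$@"; }

    rpc bdev_split_create Nvme0n1 2                          # -> Nvme0n1p0, Nvme0n1p1
    rpc bdev_malloc_create 8 4096 --name Malloc3
    rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3
    rpc bdev_null_create Null0 32 512
    rpc bdev_malloc_create 32 512 --name Malloc0
    rpc bdev_malloc_create 16 4096 --name Malloc1
    dd if=/dev/zero of=/sample_aio bs=1024 count=102400      # 100 MiB backing file for the aio bdev
    rpc bdev_aio_create /sample_aio aio_disk 1024
    rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test
    rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev
    rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456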
07:13:41 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:08:08.917 07:13:41 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:08.917 07:13:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@59 -- # return 0 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:08:08.917 07:13:41 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:08.917 07:13:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:08:08.917 07:13:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:08:08.917 07:13:41 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:08:09.176 07:13:41 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:08:09.176 07:13:41 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:09.176 07:13:41 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:09.176 07:13:41 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:08:09.176 07:13:41 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:08:09.176 07:13:41 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:08:09.176 07:13:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:08:09.436 Nvme0n1p0 Nvme0n1p1 00:08:09.436 07:13:41 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:08:09.436 07:13:41 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:08:09.694 [2024-07-25 07:13:42.091020] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:09.694 [2024-07-25 07:13:42.091069] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:09.694 00:08:09.694 07:13:42 
json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:08:09.694 07:13:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:08:09.955 Malloc3 00:08:09.955 07:13:42 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:08:09.955 07:13:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:08:10.214 [2024-07-25 07:13:42.532266] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:10.214 [2024-07-25 07:13:42.532307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:10.214 [2024-07-25 07:13:42.532324] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd5ff40 00:08:10.214 [2024-07-25 07:13:42.532336] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:10.214 [2024-07-25 07:13:42.533725] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:10.214 [2024-07-25 07:13:42.533752] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:08:10.214 PTBdevFromMalloc3 00:08:10.214 07:13:42 json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:08:10.214 07:13:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:08:10.472 Null0 00:08:10.472 07:13:42 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:08:10.472 07:13:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:08:10.472 Malloc0 00:08:10.473 07:13:43 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:08:10.473 07:13:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:08:10.731 Malloc1 00:08:10.731 07:13:43 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:08:10.731 07:13:43 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:08:10.989 102400+0 records in 00:08:10.989 102400+0 records out 00:08:10.989 104857600 bytes (105 MB, 100 MiB) copied, 0.271026 s, 387 MB/s 00:08:10.989 07:13:43 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:08:10.989 07:13:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:08:11.248 aio_disk 00:08:11.248 07:13:43 json_config -- json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:08:11.248 07:13:43 
json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:08:11.248 07:13:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:08:15.432 d596ca77-282e-4b68-8cdb-2a60343a96f9 00:08:15.432 07:13:47 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:08:15.432 07:13:47 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:08:15.432 07:13:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:08:15.690 07:13:48 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:08:15.690 07:13:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:08:15.948 07:13:48 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:08:15.948 07:13:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:08:16.206 07:13:48 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:08:16.206 07:13:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:08:16.464 07:13:48 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:08:16.464 07:13:48 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:08:16.464 07:13:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:08:16.464 MallocForCryptoBdev 00:08:16.464 07:13:48 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:08:16.464 07:13:48 json_config -- json_config/json_config.sh@163 -- # wc -l 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@163 -- # [[ 5 -eq 0 ]] 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:08:16.723 07:13:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:08:16.723 [2024-07-25 07:13:49.224154] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:08:16.723 CryptoMallocBdev 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@173 -- # 
expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:f1af9f52-18ac-4b5f-921d-e8a3c3e408d2 bdev_register:13a379bf-722b-4f43-8eaf-e27d437f3b69 bdev_register:b19caddb-a9d9-4329-b232-75bdbb4fc2b8 bdev_register:da83d742-88c8-4b60-970d-453f134cfefa bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:08:16.723 07:13:49 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:f1af9f52-18ac-4b5f-921d-e8a3c3e408d2 bdev_register:13a379bf-722b-4f43-8eaf-e27d437f3b69 bdev_register:b19caddb-a9d9-4329-b232-75bdbb4fc2b8 bdev_register:da83d742-88c8-4b60-970d-453f134cfefa bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@75 -- # sort 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@76 -- # sort 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:08:16.724 07:13:49 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:08:16.724 07:13:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:08:16.983 07:13:49 json_config -- 
json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Null0 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:08:16.983 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:f1af9f52-18ac-4b5f-921d-e8a3c3e408d2 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:13a379bf-722b-4f43-8eaf-e27d437f3b69 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:b19caddb-a9d9-4329-b232-75bdbb4fc2b8 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- 
json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:da83d742-88c8-4b60-970d-453f134cfefa 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:13a379bf-722b-4f43-8eaf-e27d437f3b69 bdev_register:aio_disk bdev_register:b19caddb-a9d9-4329-b232-75bdbb4fc2b8 bdev_register:CryptoMallocBdev bdev_register:da83d742-88c8-4b60-970d-453f134cfefa bdev_register:f1af9f52-18ac-4b5f-921d-e8a3c3e408d2 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\3\a\3\7\9\b\f\-\7\2\2\b\-\4\f\4\3\-\8\e\a\f\-\e\2\7\d\4\3\7\f\3\b\6\9\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\1\9\c\a\d\d\b\-\a\9\d\9\-\4\3\2\9\-\b\2\3\2\-\7\5\b\d\b\b\4\f\c\2\b\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\a\8\3\d\7\4\2\-\8\8\c\8\-\4\b\6\0\-\9\7\0\d\-\4\5\3\f\1\3\4\c\f\e\f\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\f\1\a\f\9\f\5\2\-\1\8\a\c\-\4\b\5\f\-\9\2\1\d\-\e\8\a\3\c\3\e\4\0\8\d\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@90 -- # cat 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:13a379bf-722b-4f43-8eaf-e27d437f3b69 bdev_register:aio_disk bdev_register:b19caddb-a9d9-4329-b232-75bdbb4fc2b8 bdev_register:CryptoMallocBdev bdev_register:da83d742-88c8-4b60-970d-453f134cfefa bdev_register:f1af9f52-18ac-4b5f-921d-e8a3c3e408d2 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:08:16.984 Expected events matched: 00:08:16.984 bdev_register:13a379bf-722b-4f43-8eaf-e27d437f3b69 00:08:16.984 bdev_register:aio_disk 
00:08:16.984 bdev_register:b19caddb-a9d9-4329-b232-75bdbb4fc2b8 00:08:16.984 bdev_register:CryptoMallocBdev 00:08:16.984 bdev_register:da83d742-88c8-4b60-970d-453f134cfefa 00:08:16.984 bdev_register:f1af9f52-18ac-4b5f-921d-e8a3c3e408d2 00:08:16.984 bdev_register:Malloc0 00:08:16.984 bdev_register:Malloc0p0 00:08:16.984 bdev_register:Malloc0p1 00:08:16.984 bdev_register:Malloc0p2 00:08:16.984 bdev_register:Malloc1 00:08:16.984 bdev_register:Malloc3 00:08:16.984 bdev_register:MallocForCryptoBdev 00:08:16.984 bdev_register:Null0 00:08:16.984 bdev_register:Nvme0n1 00:08:16.984 bdev_register:Nvme0n1p0 00:08:16.984 bdev_register:Nvme0n1p1 00:08:16.984 bdev_register:PTBdevFromMalloc3 00:08:16.984 07:13:49 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:08:16.984 07:13:49 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:16.984 07:13:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:17.243 07:13:49 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:08:17.243 07:13:49 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:08:17.243 07:13:49 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:08:17.243 07:13:49 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:08:17.243 07:13:49 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:17.243 07:13:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:17.243 07:13:49 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:08:17.243 07:13:49 json_config -- json_config/json_config.sh@304 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:08:17.243 07:13:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:08:17.502 MallocBdevForConfigChangeCheck 00:08:17.502 07:13:49 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:08:17.502 07:13:49 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:17.502 07:13:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:17.502 07:13:49 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:08:17.502 07:13:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:17.760 07:13:50 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:08:17.760 INFO: shutting down applications... 
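Condensed, the trace above does two things before the compare phase: it registers a small throwaway malloc bdev (MallocBdevForConfigChangeCheck) and then snapshots the running configuration over the RPC socket with save_config; deleting that bdev later is what makes a fresh snapshot diverge. A minimal sketch of that pattern, using only rpc.py calls already visible in this log (the /tmp output path is illustrative, not from the run):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"

# Register a small "canary" bdev; removing it later should be detectable.
$RPC bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck

# Snapshot the current target configuration through the RPC socket.
$RPC save_config > /tmp/config_snapshot.json   # illustrative path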
00:08:17.760 07:13:50 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:08:17.760 07:13:50 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:08:17.760 07:13:50 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:08:17.760 07:13:50 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:08:18.018 [2024-07-25 07:13:50.399732] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:08:20.550 Calling clear_iscsi_subsystem 00:08:20.550 Calling clear_nvmf_subsystem 00:08:20.550 Calling clear_nbd_subsystem 00:08:20.550 Calling clear_ublk_subsystem 00:08:20.550 Calling clear_vhost_blk_subsystem 00:08:20.550 Calling clear_vhost_scsi_subsystem 00:08:20.550 Calling clear_bdev_subsystem 00:08:20.550 07:13:52 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:08:20.550 07:13:52 json_config -- json_config/json_config.sh@347 -- # count=100 00:08:20.550 07:13:52 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:08:20.550 07:13:52 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:20.550 07:13:52 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:08:20.550 07:13:52 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:08:20.808 07:13:53 json_config -- json_config/json_config.sh@349 -- # break 00:08:20.808 07:13:53 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:08:20.808 07:13:53 json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:08:20.808 07:13:53 json_config -- json_config/common.sh@31 -- # local app=target 00:08:20.808 07:13:53 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:08:20.808 07:13:53 json_config -- json_config/common.sh@35 -- # [[ -n 1537826 ]] 00:08:20.808 07:13:53 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1537826 00:08:20.808 07:13:53 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:08:20.808 07:13:53 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:20.808 07:13:53 json_config -- json_config/common.sh@41 -- # kill -0 1537826 00:08:20.808 07:13:53 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:08:21.375 07:13:53 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:08:21.375 07:13:53 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:21.375 07:13:53 json_config -- json_config/common.sh@41 -- # kill -0 1537826 00:08:21.375 07:13:53 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:08:21.375 07:13:53 json_config -- json_config/common.sh@43 -- # break 00:08:21.375 07:13:53 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:08:21.375 07:13:53 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:08:21.375 SPDK target shutdown done 00:08:21.375 07:13:53 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:08:21.375 INFO: relaunching applications... 
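The shutdown just logged (kill -SIGINT on the target PID, then repeated kill -0 probes every 0.5s, up to 30 tries) is a bounded graceful-stop loop. A rough bash equivalent, with the PID treated as a placeholder rather than taken from this run:

app_pid=1537826                 # placeholder; in the test this comes from the launch step
kill -SIGINT "$app_pid"         # ask the SPDK target to shut down cleanly
for _ in $(seq 1 30); do
    if ! kill -0 "$app_pid" 2> /dev/null; then
        echo 'SPDK target shutdown done'
        break
    fi
    sleep 0.5                   # same poll interval as in the trace above
done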
00:08:21.375 07:13:53 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:21.375 07:13:53 json_config -- json_config/common.sh@9 -- # local app=target 00:08:21.375 07:13:53 json_config -- json_config/common.sh@10 -- # shift 00:08:21.376 07:13:53 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:21.376 07:13:53 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:21.376 07:13:53 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:08:21.376 07:13:53 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:21.376 07:13:53 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:21.376 07:13:53 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1541247 00:08:21.376 07:13:53 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:21.376 Waiting for target to run... 00:08:21.376 07:13:53 json_config -- json_config/common.sh@25 -- # waitforlisten 1541247 /var/tmp/spdk_tgt.sock 00:08:21.376 07:13:53 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:21.376 07:13:53 json_config -- common/autotest_common.sh@831 -- # '[' -z 1541247 ']' 00:08:21.376 07:13:53 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:21.376 07:13:53 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:21.376 07:13:53 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:21.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:21.376 07:13:53 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:21.376 07:13:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:21.376 [2024-07-25 07:13:53.841687] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:08:21.376 [2024-07-25 07:13:53.841754] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1541247 ] 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:21.944 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.944 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.6 cannot be used 
00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:21.945 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:21.945 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:21.945 [2024-07-25 07:13:54.353663] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.945 [2024-07-25 07:13:54.448734] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.203 [2024-07-25 07:13:54.502802] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:08:22.203 [2024-07-25 07:13:54.510837] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:08:22.203 [2024-07-25 07:13:54.518855] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:08:22.203 [2024-07-25 07:13:54.599690] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:08:24.754 [2024-07-25 07:13:56.739000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:24.754 [2024-07-25 07:13:56.739058] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:24.754 [2024-07-25 07:13:56.739071] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:24.754 [2024-07-25 07:13:56.747016] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:08:24.754 [2024-07-25 07:13:56.747041] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:08:24.754 [2024-07-25 07:13:56.755033] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:24.754 [2024-07-25 07:13:56.755060] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:24.754 [2024-07-25 07:13:56.763063] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:08:24.754 [2024-07-25 07:13:56.763089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:08:24.754 [2024-07-25 07:13:56.763100] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:27.318 [2024-07-25 07:13:59.661197] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:27.318 [2024-07-25 07:13:59.661244] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:27.318 [2024-07-25 07:13:59.661260] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb6af0 00:08:27.318 [2024-07-25 07:13:59.661272] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:27.318 [2024-07-25 07:13:59.661544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:27.318 [2024-07-25 07:13:59.661562] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:08:27.576 07:13:59 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:27.576 07:13:59 json_config -- common/autotest_common.sh@864 -- # return 0 00:08:27.576 07:13:59 json_config -- json_config/common.sh@26 -- # echo '' 00:08:27.576 00:08:27.576 07:13:59 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:08:27.576 07:13:59 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:08:27.576 INFO: Checking if target configuration is the same... 00:08:27.576 07:13:59 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:08:27.576 07:13:59 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:27.576 07:13:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:27.576 + '[' 2 -ne 2 ']' 00:08:27.576 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:08:27.576 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:08:27.576 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:27.576 +++ basename /dev/fd/62 00:08:27.576 ++ mktemp /tmp/62.XXX 00:08:27.576 + tmp_file_1=/tmp/62.a9N 00:08:27.576 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:27.576 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:08:27.576 + tmp_file_2=/tmp/spdk_tgt_config.json.WPM 00:08:27.576 + ret=0 00:08:27.576 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:27.834 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:27.834 + diff -u /tmp/62.a9N /tmp/spdk_tgt_config.json.WPM 00:08:27.834 + echo 'INFO: JSON config files are the same' 00:08:27.834 INFO: JSON config files are the same 00:08:27.834 + rm /tmp/62.a9N /tmp/spdk_tgt_config.json.WPM 00:08:27.834 + exit 0 00:08:28.093 07:14:00 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:08:28.093 07:14:00 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:08:28.093 INFO: changing configuration and checking if this can be detected... 
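The json_diff.sh fragment above amounts to: take the live configuration (save_config streamed through /dev/fd/62) and the on-disk spdk_tgt_config.json, normalize both with config_filter.py -method sort so ordering differences do not register, and diff -u the results; exit status 0 means the configurations match. A condensed sketch of the same comparison (the /tmp file names are invented for illustration, and config_filter.py is assumed to read stdin, as the trace suggests):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
FILTER="$SPDK/test/json_config/config_filter.py"

# Normalize both sides, then compare.
$RPC save_config | $FILTER -method sort > /tmp/live_sorted.json
$FILTER -method sort < "$SPDK/spdk_tgt_config.json" > /tmp/file_sorted.json

if diff -u /tmp/live_sorted.json /tmp/file_sorted.json; then
    echo 'INFO: JSON config files are the same'
fi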
00:08:28.093 07:14:00 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:08:28.093 07:14:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:08:28.093 07:14:00 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:28.093 07:14:00 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:08:28.093 07:14:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:28.093 + '[' 2 -ne 2 ']' 00:08:28.093 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:08:28.093 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:08:28.351 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:28.351 +++ basename /dev/fd/62 00:08:28.351 ++ mktemp /tmp/62.XXX 00:08:28.351 + tmp_file_1=/tmp/62.ng3 00:08:28.351 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:28.351 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:08:28.351 + tmp_file_2=/tmp/spdk_tgt_config.json.UY7 00:08:28.351 + ret=0 00:08:28.351 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:28.609 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:28.609 + diff -u /tmp/62.ng3 /tmp/spdk_tgt_config.json.UY7 00:08:28.609 + ret=1 00:08:28.609 + echo '=== Start of file: /tmp/62.ng3 ===' 00:08:28.609 + cat /tmp/62.ng3 00:08:28.609 + echo '=== End of file: /tmp/62.ng3 ===' 00:08:28.609 + echo '' 00:08:28.609 + echo '=== Start of file: /tmp/spdk_tgt_config.json.UY7 ===' 00:08:28.609 + cat /tmp/spdk_tgt_config.json.UY7 00:08:28.609 + echo '=== End of file: /tmp/spdk_tgt_config.json.UY7 ===' 00:08:28.609 + echo '' 00:08:28.609 + rm /tmp/62.ng3 /tmp/spdk_tgt_config.json.UY7 00:08:28.609 + exit 1 00:08:28.609 07:14:01 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:08:28.609 INFO: configuration change detected. 
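The change-detection pass above is the mirror image of the previous check: delete the canary bdev over RPC, re-run the same normalized diff, and this time a non-empty diff (ret=1) is the expected result. Roughly, under the same illustrative file-name assumptions as before:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
FILTER="$SPDK/test/json_config/config_filter.py"

# Remove the canary bdev that was present when spdk_tgt_config.json was saved.
$RPC bdev_malloc_delete MallocBdevForConfigChangeCheck

# Re-run the normalized comparison; a difference is now the success case.
$RPC save_config | $FILTER -method sort > /tmp/live_sorted.json
$FILTER -method sort < "$SPDK/spdk_tgt_config.json" > /tmp/file_sorted.json
if ! diff -u /tmp/live_sorted.json /tmp/file_sorted.json > /dev/null; then
    echo 'INFO: configuration change detected.'
fi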
00:08:28.609 07:14:01 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:08:28.609 07:14:01 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:08:28.609 07:14:01 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:28.609 07:14:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:28.609 07:14:01 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:08:28.609 07:14:01 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:08:28.609 07:14:01 json_config -- json_config/json_config.sh@321 -- # [[ -n 1541247 ]] 00:08:28.610 07:14:01 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:08:28.610 07:14:01 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:08:28.610 07:14:01 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:28.610 07:14:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:28.610 07:14:01 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:08:28.610 07:14:01 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:08:28.610 07:14:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:08:28.868 07:14:01 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:08:28.868 07:14:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:08:29.126 07:14:01 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:08:29.126 07:14:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:08:29.384 07:14:01 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:08:29.384 07:14:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:08:29.643 07:14:01 json_config -- json_config/json_config.sh@197 -- # uname -s 00:08:29.643 07:14:01 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:08:29.643 07:14:01 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:08:29.643 07:14:01 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:08:29.643 07:14:01 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:08:29.643 07:14:01 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:29.643 07:14:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:29.643 07:14:02 json_config -- json_config/json_config.sh@327 -- # killprocess 1541247 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@950 -- # '[' -z 1541247 ']' 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@954 -- # kill -0 1541247 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@955 -- # uname 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1541247 00:08:29.643 07:14:02 json_config -- 
common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1541247' 00:08:29.643 killing process with pid 1541247 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@969 -- # kill 1541247 00:08:29.643 07:14:02 json_config -- common/autotest_common.sh@974 -- # wait 1541247 00:08:32.179 07:14:04 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:32.179 07:14:04 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:08:32.179 07:14:04 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:32.179 07:14:04 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:32.179 07:14:04 json_config -- json_config/json_config.sh@332 -- # return 0 00:08:32.179 07:14:04 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:08:32.179 INFO: Success 00:08:32.179 00:08:32.179 real 0m30.598s 00:08:32.179 user 0m35.371s 00:08:32.179 sys 0m3.690s 00:08:32.179 07:14:04 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:32.179 07:14:04 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:32.179 ************************************ 00:08:32.179 END TEST json_config 00:08:32.179 ************************************ 00:08:32.179 07:14:04 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:32.179 07:14:04 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:32.179 07:14:04 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:32.179 07:14:04 -- common/autotest_common.sh@10 -- # set +x 00:08:32.179 ************************************ 00:08:32.179 START TEST json_config_extra_key 00:08:32.179 ************************************ 00:08:32.179 07:14:04 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@17 -- # 
NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:08:32.437 07:14:04 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:32.437 07:14:04 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:32.437 07:14:04 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:32.437 07:14:04 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.437 07:14:04 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.437 07:14:04 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.437 07:14:04 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:08:32.437 07:14:04 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:32.437 07:14:04 
json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:08:32.437 07:14:04 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:08:32.437 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:08:32.438 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:08:32.438 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:08:32.438 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:08:32.438 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:08:32.438 INFO: launching applications... 00:08:32.438 07:14:04 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1543223 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:32.438 Waiting for target to run... 
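The launch sequence here starts spdk_tgt with a prebuilt JSON config and a private RPC socket, then waits for the socket to answer before driving the test. A rough equivalent is sketched below; using spdk_get_version as the readiness probe is an assumption for illustration, not necessarily what the test's waitforlisten helper does internally:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
SOCK=/var/tmp/spdk_tgt.sock

# Start the target with the extra_key JSON config on its own RPC socket.
"$SPDK/build/bin/spdk_tgt" -m 0x1 -s 1024 -r "$SOCK" \
    --json "$SPDK/test/json_config/extra_key.json" &
tgt_pid=$!

# Poll the RPC socket until the target responds.
until "$SPDK/scripts/rpc.py" -s "$SOCK" spdk_get_version > /dev/null 2>&1; do
    sleep 0.5
done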
00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1543223 /var/tmp/spdk_tgt.sock 00:08:32.438 07:14:04 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 1543223 ']' 00:08:32.438 07:14:04 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:32.438 07:14:04 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:32.438 07:14:04 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:32.438 07:14:04 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:32.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:32.438 07:14:04 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:32.438 07:14:04 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:32.438 [2024-07-25 07:14:04.883415] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:08:32.438 [2024-07-25 07:14:04.883487] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1543223 ] 00:08:32.696 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.696 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:32.696 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.696 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:32.696 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.696 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:32.696 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.696 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:32.696 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.696 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:32.696 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.696 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:32.696 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.696 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested 
device 0000:3d:02.5 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.6 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:32.697 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:32.697 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:32.955 [2024-07-25 07:14:05.253730] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.955 [2024-07-25 07:14:05.331330] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.521 07:14:05 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:33.521 07:14:05 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:08:33.521 00:08:33.521 07:14:05 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:08:33.521 INFO: shutting down applications... 
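For orientation on what a --json startup file such as extra_key.json contains: these files use the same subsystems/config layout that save_config emits, i.e. a list of subsystems, each carrying method/params entries that are replayed at startup. The fragment below is a hand-written illustration of that general shape only; the file name, bdev name, and sizes are invented and are not the contents of the real extra_key.json:

cat > /tmp/minimal_tgt_config.json << 'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF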
00:08:33.521 07:14:05 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1543223 ]] 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1543223 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1543223 00:08:33.521 07:14:05 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:08:33.779 07:14:06 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:08:33.779 07:14:06 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:33.779 07:14:06 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1543223 00:08:33.779 07:14:06 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:08:33.779 07:14:06 json_config_extra_key -- json_config/common.sh@43 -- # break 00:08:33.779 07:14:06 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:08:33.779 07:14:06 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:08:33.779 SPDK target shutdown done 00:08:33.779 07:14:06 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:08:33.779 Success 00:08:33.779 00:08:33.779 real 0m1.575s 00:08:33.779 user 0m1.200s 00:08:33.779 sys 0m0.496s 00:08:33.779 07:14:06 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:33.779 07:14:06 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:33.779 ************************************ 00:08:33.779 END TEST json_config_extra_key 00:08:33.779 ************************************ 00:08:34.037 07:14:06 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:34.037 07:14:06 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:34.037 07:14:06 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:34.037 07:14:06 -- common/autotest_common.sh@10 -- # set +x 00:08:34.037 ************************************ 00:08:34.037 START TEST alias_rpc 00:08:34.037 ************************************ 00:08:34.037 07:14:06 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:34.037 * Looking for test storage... 
00:08:34.037 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:08:34.037 07:14:06 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:34.037 07:14:06 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1543568 00:08:34.037 07:14:06 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:34.037 07:14:06 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1543568 00:08:34.037 07:14:06 alias_rpc -- common/autotest_common.sh@831 -- # '[' -z 1543568 ']' 00:08:34.038 07:14:06 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.038 07:14:06 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:34.038 07:14:06 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:34.038 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:34.038 07:14:06 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:34.038 07:14:06 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:34.038 [2024-07-25 07:14:06.530807] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:08:34.038 [2024-07-25 07:14:06.530874] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1543568 ] 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number 
of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.6 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.296 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:34.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.297 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:34.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.297 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:34.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.297 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:34.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:34.297 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:34.297 [2024-07-25 07:14:06.663478] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.297 [2024-07-25 07:14:06.749039] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.231 07:14:07 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:35.231 07:14:07 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:08:35.232 07:14:07 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:08:35.232 07:14:07 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1543568 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 1543568 ']' 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 1543568 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = 
Linux ']' 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1543568 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1543568' 00:08:35.232 killing process with pid 1543568 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@969 -- # kill 1543568 00:08:35.232 07:14:07 alias_rpc -- common/autotest_common.sh@974 -- # wait 1543568 00:08:35.798 00:08:35.798 real 0m1.693s 00:08:35.798 user 0m1.838s 00:08:35.798 sys 0m0.542s 00:08:35.798 07:14:08 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:35.798 07:14:08 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:35.798 ************************************ 00:08:35.798 END TEST alias_rpc 00:08:35.798 ************************************ 00:08:35.798 07:14:08 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:08:35.798 07:14:08 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:35.798 07:14:08 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:35.798 07:14:08 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:35.798 07:14:08 -- common/autotest_common.sh@10 -- # set +x 00:08:35.798 ************************************ 00:08:35.798 START TEST spdkcli_tcp 00:08:35.798 ************************************ 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:35.798 * Looking for test storage... 00:08:35.798 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1543983 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1543983 00:08:35.798 07:14:08 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 1543983 ']' 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk.sock...' 00:08:35.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:35.798 07:14:08 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:35.798 [2024-07-25 07:14:08.305553] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:08:35.799 [2024-07-25 07:14:08.305618] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1543983 ] 00:08:36.056 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:36.057 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.6 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:36.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:36.057 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:36.057 [2024-07-25 07:14:08.437526] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:36.057 [2024-07-25 07:14:08.525339] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:36.057 [2024-07-25 07:14:08.525346] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.990 07:14:09 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:36.990 07:14:09 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:08:36.990 07:14:09 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1544115 00:08:36.990 07:14:09 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:08:36.990 07:14:09 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:08:36.990 [ 00:08:36.990 "bdev_malloc_delete", 00:08:36.990 "bdev_malloc_create", 00:08:36.990 "bdev_null_resize", 00:08:36.990 "bdev_null_delete", 00:08:36.990 "bdev_null_create", 00:08:36.990 "bdev_nvme_cuse_unregister", 00:08:36.990 "bdev_nvme_cuse_register", 00:08:36.990 "bdev_opal_new_user", 00:08:36.990 "bdev_opal_set_lock_state", 00:08:36.990 "bdev_opal_delete", 00:08:36.990 "bdev_opal_get_info", 00:08:36.990 "bdev_opal_create", 00:08:36.990 "bdev_nvme_opal_revert", 00:08:36.990 "bdev_nvme_opal_init", 00:08:36.990 "bdev_nvme_send_cmd", 00:08:36.990 "bdev_nvme_get_path_iostat", 00:08:36.990 "bdev_nvme_get_mdns_discovery_info", 00:08:36.990 "bdev_nvme_stop_mdns_discovery", 00:08:36.990 "bdev_nvme_start_mdns_discovery", 00:08:36.990 "bdev_nvme_set_multipath_policy", 00:08:36.990 "bdev_nvme_set_preferred_path", 00:08:36.990 "bdev_nvme_get_io_paths", 00:08:36.990 "bdev_nvme_remove_error_injection", 00:08:36.990 "bdev_nvme_add_error_injection", 00:08:36.990 "bdev_nvme_get_discovery_info", 00:08:36.990 
"bdev_nvme_stop_discovery", 00:08:36.990 "bdev_nvme_start_discovery", 00:08:36.990 "bdev_nvme_get_controller_health_info", 00:08:36.990 "bdev_nvme_disable_controller", 00:08:36.990 "bdev_nvme_enable_controller", 00:08:36.990 "bdev_nvme_reset_controller", 00:08:36.990 "bdev_nvme_get_transport_statistics", 00:08:36.990 "bdev_nvme_apply_firmware", 00:08:36.990 "bdev_nvme_detach_controller", 00:08:36.990 "bdev_nvme_get_controllers", 00:08:36.990 "bdev_nvme_attach_controller", 00:08:36.990 "bdev_nvme_set_hotplug", 00:08:36.990 "bdev_nvme_set_options", 00:08:36.990 "bdev_passthru_delete", 00:08:36.990 "bdev_passthru_create", 00:08:36.990 "bdev_lvol_set_parent_bdev", 00:08:36.990 "bdev_lvol_set_parent", 00:08:36.990 "bdev_lvol_check_shallow_copy", 00:08:36.990 "bdev_lvol_start_shallow_copy", 00:08:36.990 "bdev_lvol_grow_lvstore", 00:08:36.990 "bdev_lvol_get_lvols", 00:08:36.990 "bdev_lvol_get_lvstores", 00:08:36.990 "bdev_lvol_delete", 00:08:36.990 "bdev_lvol_set_read_only", 00:08:36.990 "bdev_lvol_resize", 00:08:36.990 "bdev_lvol_decouple_parent", 00:08:36.990 "bdev_lvol_inflate", 00:08:36.990 "bdev_lvol_rename", 00:08:36.990 "bdev_lvol_clone_bdev", 00:08:36.990 "bdev_lvol_clone", 00:08:36.990 "bdev_lvol_snapshot", 00:08:36.990 "bdev_lvol_create", 00:08:36.990 "bdev_lvol_delete_lvstore", 00:08:36.990 "bdev_lvol_rename_lvstore", 00:08:36.990 "bdev_lvol_create_lvstore", 00:08:36.990 "bdev_raid_set_options", 00:08:36.990 "bdev_raid_remove_base_bdev", 00:08:36.990 "bdev_raid_add_base_bdev", 00:08:36.990 "bdev_raid_delete", 00:08:36.990 "bdev_raid_create", 00:08:36.990 "bdev_raid_get_bdevs", 00:08:36.990 "bdev_error_inject_error", 00:08:36.990 "bdev_error_delete", 00:08:36.990 "bdev_error_create", 00:08:36.990 "bdev_split_delete", 00:08:36.990 "bdev_split_create", 00:08:36.990 "bdev_delay_delete", 00:08:36.990 "bdev_delay_create", 00:08:36.990 "bdev_delay_update_latency", 00:08:36.990 "bdev_zone_block_delete", 00:08:36.990 "bdev_zone_block_create", 00:08:36.990 "blobfs_create", 00:08:36.990 "blobfs_detect", 00:08:36.990 "blobfs_set_cache_size", 00:08:36.990 "bdev_crypto_delete", 00:08:36.990 "bdev_crypto_create", 00:08:36.990 "bdev_compress_delete", 00:08:36.990 "bdev_compress_create", 00:08:36.990 "bdev_compress_get_orphans", 00:08:36.990 "bdev_aio_delete", 00:08:36.990 "bdev_aio_rescan", 00:08:36.990 "bdev_aio_create", 00:08:36.990 "bdev_ftl_set_property", 00:08:36.990 "bdev_ftl_get_properties", 00:08:36.990 "bdev_ftl_get_stats", 00:08:36.990 "bdev_ftl_unmap", 00:08:36.990 "bdev_ftl_unload", 00:08:36.990 "bdev_ftl_delete", 00:08:36.990 "bdev_ftl_load", 00:08:36.990 "bdev_ftl_create", 00:08:36.990 "bdev_virtio_attach_controller", 00:08:36.990 "bdev_virtio_scsi_get_devices", 00:08:36.990 "bdev_virtio_detach_controller", 00:08:36.990 "bdev_virtio_blk_set_hotplug", 00:08:36.990 "bdev_iscsi_delete", 00:08:36.990 "bdev_iscsi_create", 00:08:36.990 "bdev_iscsi_set_options", 00:08:36.990 "accel_error_inject_error", 00:08:36.990 "ioat_scan_accel_module", 00:08:36.990 "dsa_scan_accel_module", 00:08:36.990 "iaa_scan_accel_module", 00:08:36.990 "dpdk_cryptodev_get_driver", 00:08:36.990 "dpdk_cryptodev_set_driver", 00:08:36.990 "dpdk_cryptodev_scan_accel_module", 00:08:36.990 "compressdev_scan_accel_module", 00:08:36.990 "keyring_file_remove_key", 00:08:36.990 "keyring_file_add_key", 00:08:36.990 "keyring_linux_set_options", 00:08:36.990 "iscsi_get_histogram", 00:08:36.990 "iscsi_enable_histogram", 00:08:36.990 "iscsi_set_options", 00:08:36.990 "iscsi_get_auth_groups", 00:08:36.990 
"iscsi_auth_group_remove_secret", 00:08:36.990 "iscsi_auth_group_add_secret", 00:08:36.990 "iscsi_delete_auth_group", 00:08:36.990 "iscsi_create_auth_group", 00:08:36.991 "iscsi_set_discovery_auth", 00:08:36.991 "iscsi_get_options", 00:08:36.991 "iscsi_target_node_request_logout", 00:08:36.991 "iscsi_target_node_set_redirect", 00:08:36.991 "iscsi_target_node_set_auth", 00:08:36.991 "iscsi_target_node_add_lun", 00:08:36.991 "iscsi_get_stats", 00:08:36.991 "iscsi_get_connections", 00:08:36.991 "iscsi_portal_group_set_auth", 00:08:36.991 "iscsi_start_portal_group", 00:08:36.991 "iscsi_delete_portal_group", 00:08:36.991 "iscsi_create_portal_group", 00:08:36.991 "iscsi_get_portal_groups", 00:08:36.991 "iscsi_delete_target_node", 00:08:36.991 "iscsi_target_node_remove_pg_ig_maps", 00:08:36.991 "iscsi_target_node_add_pg_ig_maps", 00:08:36.991 "iscsi_create_target_node", 00:08:36.991 "iscsi_get_target_nodes", 00:08:36.991 "iscsi_delete_initiator_group", 00:08:36.991 "iscsi_initiator_group_remove_initiators", 00:08:36.991 "iscsi_initiator_group_add_initiators", 00:08:36.991 "iscsi_create_initiator_group", 00:08:36.991 "iscsi_get_initiator_groups", 00:08:36.991 "nvmf_set_crdt", 00:08:36.991 "nvmf_set_config", 00:08:36.991 "nvmf_set_max_subsystems", 00:08:36.991 "nvmf_stop_mdns_prr", 00:08:36.991 "nvmf_publish_mdns_prr", 00:08:36.991 "nvmf_subsystem_get_listeners", 00:08:36.991 "nvmf_subsystem_get_qpairs", 00:08:36.991 "nvmf_subsystem_get_controllers", 00:08:36.991 "nvmf_get_stats", 00:08:36.991 "nvmf_get_transports", 00:08:36.991 "nvmf_create_transport", 00:08:36.991 "nvmf_get_targets", 00:08:36.991 "nvmf_delete_target", 00:08:36.991 "nvmf_create_target", 00:08:36.991 "nvmf_subsystem_allow_any_host", 00:08:36.991 "nvmf_subsystem_remove_host", 00:08:36.991 "nvmf_subsystem_add_host", 00:08:36.991 "nvmf_ns_remove_host", 00:08:36.991 "nvmf_ns_add_host", 00:08:36.991 "nvmf_subsystem_remove_ns", 00:08:36.991 "nvmf_subsystem_add_ns", 00:08:36.991 "nvmf_subsystem_listener_set_ana_state", 00:08:36.991 "nvmf_discovery_get_referrals", 00:08:36.991 "nvmf_discovery_remove_referral", 00:08:36.991 "nvmf_discovery_add_referral", 00:08:36.991 "nvmf_subsystem_remove_listener", 00:08:36.991 "nvmf_subsystem_add_listener", 00:08:36.991 "nvmf_delete_subsystem", 00:08:36.991 "nvmf_create_subsystem", 00:08:36.991 "nvmf_get_subsystems", 00:08:36.991 "env_dpdk_get_mem_stats", 00:08:36.991 "nbd_get_disks", 00:08:36.991 "nbd_stop_disk", 00:08:36.991 "nbd_start_disk", 00:08:36.991 "ublk_recover_disk", 00:08:36.991 "ublk_get_disks", 00:08:36.991 "ublk_stop_disk", 00:08:36.991 "ublk_start_disk", 00:08:36.991 "ublk_destroy_target", 00:08:36.991 "ublk_create_target", 00:08:36.991 "virtio_blk_create_transport", 00:08:36.991 "virtio_blk_get_transports", 00:08:36.991 "vhost_controller_set_coalescing", 00:08:36.991 "vhost_get_controllers", 00:08:36.991 "vhost_delete_controller", 00:08:36.991 "vhost_create_blk_controller", 00:08:36.991 "vhost_scsi_controller_remove_target", 00:08:36.991 "vhost_scsi_controller_add_target", 00:08:36.991 "vhost_start_scsi_controller", 00:08:36.991 "vhost_create_scsi_controller", 00:08:36.991 "thread_set_cpumask", 00:08:36.991 "scheduler_set_options", 00:08:36.991 "framework_get_governor", 00:08:36.991 "framework_get_scheduler", 00:08:36.991 "framework_set_scheduler", 00:08:36.991 "framework_get_reactors", 00:08:36.991 "thread_get_io_channels", 00:08:36.991 "thread_get_pollers", 00:08:36.991 "thread_get_stats", 00:08:36.991 "framework_monitor_context_switch", 00:08:36.991 "spdk_kill_instance", 00:08:36.991 
"log_enable_timestamps", 00:08:36.991 "log_get_flags", 00:08:36.991 "log_clear_flag", 00:08:36.991 "log_set_flag", 00:08:36.991 "log_get_level", 00:08:36.991 "log_set_level", 00:08:36.991 "log_get_print_level", 00:08:36.991 "log_set_print_level", 00:08:36.991 "framework_enable_cpumask_locks", 00:08:36.991 "framework_disable_cpumask_locks", 00:08:36.991 "framework_wait_init", 00:08:36.991 "framework_start_init", 00:08:36.991 "scsi_get_devices", 00:08:36.991 "bdev_get_histogram", 00:08:36.991 "bdev_enable_histogram", 00:08:36.991 "bdev_set_qos_limit", 00:08:36.991 "bdev_set_qd_sampling_period", 00:08:36.991 "bdev_get_bdevs", 00:08:36.991 "bdev_reset_iostat", 00:08:36.991 "bdev_get_iostat", 00:08:36.991 "bdev_examine", 00:08:36.991 "bdev_wait_for_examine", 00:08:36.991 "bdev_set_options", 00:08:36.991 "notify_get_notifications", 00:08:36.991 "notify_get_types", 00:08:36.991 "accel_get_stats", 00:08:36.991 "accel_set_options", 00:08:36.991 "accel_set_driver", 00:08:36.991 "accel_crypto_key_destroy", 00:08:36.991 "accel_crypto_keys_get", 00:08:36.991 "accel_crypto_key_create", 00:08:36.991 "accel_assign_opc", 00:08:36.991 "accel_get_module_info", 00:08:36.991 "accel_get_opc_assignments", 00:08:36.991 "vmd_rescan", 00:08:36.991 "vmd_remove_device", 00:08:36.991 "vmd_enable", 00:08:36.991 "sock_get_default_impl", 00:08:36.991 "sock_set_default_impl", 00:08:36.991 "sock_impl_set_options", 00:08:36.991 "sock_impl_get_options", 00:08:36.991 "iobuf_get_stats", 00:08:36.991 "iobuf_set_options", 00:08:36.991 "framework_get_pci_devices", 00:08:36.991 "framework_get_config", 00:08:36.991 "framework_get_subsystems", 00:08:36.991 "trace_get_info", 00:08:36.991 "trace_get_tpoint_group_mask", 00:08:36.991 "trace_disable_tpoint_group", 00:08:36.991 "trace_enable_tpoint_group", 00:08:36.991 "trace_clear_tpoint_mask", 00:08:36.991 "trace_set_tpoint_mask", 00:08:36.991 "keyring_get_keys", 00:08:36.991 "spdk_get_version", 00:08:36.991 "rpc_get_methods" 00:08:36.991 ] 00:08:36.991 07:14:09 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:08:36.991 07:14:09 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:36.991 07:14:09 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:36.991 07:14:09 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:08:36.991 07:14:09 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1543983 00:08:36.991 07:14:09 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 1543983 ']' 00:08:36.991 07:14:09 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 1543983 00:08:36.991 07:14:09 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:08:36.991 07:14:09 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:36.991 07:14:09 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1543983 00:08:37.249 07:14:09 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:37.249 07:14:09 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:37.249 07:14:09 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1543983' 00:08:37.249 killing process with pid 1543983 00:08:37.249 07:14:09 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 1543983 00:08:37.249 07:14:09 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 1543983 00:08:37.507 00:08:37.507 real 0m1.736s 00:08:37.507 user 0m3.133s 00:08:37.507 sys 0m0.606s 00:08:37.507 07:14:09 spdkcli_tcp -- common/autotest_common.sh@1126 -- # 
xtrace_disable 00:08:37.507 07:14:09 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:37.507 ************************************ 00:08:37.507 END TEST spdkcli_tcp 00:08:37.507 ************************************ 00:08:37.507 07:14:09 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:37.507 07:14:09 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:37.507 07:14:09 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:37.507 07:14:09 -- common/autotest_common.sh@10 -- # set +x 00:08:37.507 ************************************ 00:08:37.507 START TEST dpdk_mem_utility 00:08:37.507 ************************************ 00:08:37.507 07:14:09 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:37.507 * Looking for test storage... 00:08:37.764 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:08:37.764 07:14:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:37.764 07:14:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1544432 00:08:37.765 07:14:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1544432 00:08:37.765 07:14:10 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:37.765 07:14:10 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 1544432 ']' 00:08:37.765 07:14:10 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:37.765 07:14:10 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:37.765 07:14:10 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:37.765 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:37.765 07:14:10 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:37.765 07:14:10 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:37.765 [2024-07-25 07:14:10.113012] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
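The dpdk_mem_utility test starting here exercises a short pipeline that the output further below makes explicit: with spdk_tgt up, the env_dpdk_get_mem_stats RPC writes a memory dump (the RPC reply names /tmp/spdk_mem_dump.txt), and scripts/dpdk_mem_info.py then summarizes the heaps, mempools and memzones, first overall and then per heap with -m 0. A hedged bash sketch of that sequence, using only commands that appear in this log:

#!/usr/bin/env bash
# Sketch of the dpdk_mem_utility flow visible in this log, not the test script itself.
# Assumes spdk_tgt is already listening on the default /var/tmp/spdk.sock
# and that the commands are run from the SPDK repository root.
set -e

# Ask the running target to dump its DPDK memory statistics;
# per the RPC reply below, the dump lands in /tmp/spdk_mem_dump.txt.
scripts/rpc.py env_dpdk_get_mem_stats

# Summarize all heaps, mempools and memzones from that dump ...
scripts/dpdk_mem_info.py

# ... then print the per-element layout of heap 0, as the test does next.
scripts/dpdk_mem_info.py -m 0

The memzone and heap-element listings that follow are the output of those two dpdk_mem_info.py invocations.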
00:08:37.765 [2024-07-25 07:14:10.113072] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1544432 ] 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.6 cannot be used 00:08:37.765 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:37.765 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:37.765 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:37.765 [2024-07-25 07:14:10.249269] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.022 [2024-07-25 07:14:10.335438] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.587 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:38.587 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:08:38.587 07:14:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:08:38.587 07:14:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:08:38.587 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:38.587 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:38.587 { 00:08:38.587 "filename": "/tmp/spdk_mem_dump.txt" 00:08:38.587 } 00:08:38.587 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:38.587 07:14:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:38.587 DPDK memory size 814.000000 MiB in 1 heap(s) 00:08:38.587 1 heaps totaling size 814.000000 MiB 00:08:38.587 size: 814.000000 MiB heap id: 0 00:08:38.587 end heaps---------- 00:08:38.587 8 mempools totaling size 598.116089 MiB 00:08:38.587 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:08:38.587 size: 158.602051 MiB name: PDU_data_out_Pool 00:08:38.587 size: 84.521057 MiB name: bdev_io_1544432 00:08:38.587 size: 51.011292 MiB name: evtpool_1544432 00:08:38.587 size: 50.003479 MiB name: msgpool_1544432 00:08:38.587 size: 21.763794 MiB name: PDU_Pool 00:08:38.587 size: 19.513306 MiB name: SCSI_TASK_Pool 00:08:38.587 size: 0.026123 MiB name: Session_Pool 00:08:38.587 end mempools------- 00:08:38.587 201 memzones totaling size 4.176453 MiB 00:08:38.587 size: 1.000366 MiB name: RG_ring_0_1544432 00:08:38.587 size: 1.000366 MiB name: RG_ring_1_1544432 00:08:38.587 size: 1.000366 MiB name: RG_ring_4_1544432 00:08:38.587 size: 1.000366 MiB name: RG_ring_5_1544432 00:08:38.587 size: 0.125366 MiB name: RG_ring_2_1544432 00:08:38.587 size: 0.015991 MiB name: RG_ring_3_1544432 00:08:38.587 size: 0.001160 MiB name: 
QAT_SYM_CAPA_GEN_1 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.0_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.1_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.2_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.3_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.4_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.5_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.6_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:01.7_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.0_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.1_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.2_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.3_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.4_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.5_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.6_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1a:02.7_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.0_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.1_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.2_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.3_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.4_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.5_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.6_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:01.7_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.0_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.1_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.2_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.3_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.4_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.5_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.6_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1c:02.7_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.0_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.1_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.2_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.3_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.4_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.5_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.6_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:01.7_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.0_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.1_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.2_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.3_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.4_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.5_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.6_qat 00:08:38.587 size: 0.000305 MiB name: 0000:1e:02.7_qat 00:08:38.587 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_0 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_1 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_0 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_2 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_3 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_1 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_4 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_5 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_2 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_6 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_7 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_3 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_8 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_9 
00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_4 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_10 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_11 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_5 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_12 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_13 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_6 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_14 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_15 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_7 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_16 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_17 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_8 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_18 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_19 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_9 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_20 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_21 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_10 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_22 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_23 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_11 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_24 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_25 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_12 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_26 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_27 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_13 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_28 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_29 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_14 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_30 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_31 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_15 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_32 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_33 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_16 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_34 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_35 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_17 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_36 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_37 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_18 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_38 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_39 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_19 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_40 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_41 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_20 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_42 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_43 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_21 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_44 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_45 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_22 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_46 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_47 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_23 00:08:38.588 size: 0.000122 MiB name: 
rte_cryptodev_data_48 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_49 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_24 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_50 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_51 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_25 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_52 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_53 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_26 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_54 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_55 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_27 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_56 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_57 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_28 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_58 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_59 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_29 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_60 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_61 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_30 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_62 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_63 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_31 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_64 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_65 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_32 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_66 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_67 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_33 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_68 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_69 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_34 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_70 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_71 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_35 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_72 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_73 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_36 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_74 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_75 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_37 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_76 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_77 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_38 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_78 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_79 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_39 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_80 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_81 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_40 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_82 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_83 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_41 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_84 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_85 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_42 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_86 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_87 00:08:38.588 
size: 0.000122 MiB name: rte_compressdev_data_43 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_88 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_89 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_44 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_90 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_91 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_45 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_92 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_93 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_46 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_94 00:08:38.588 size: 0.000122 MiB name: rte_cryptodev_data_95 00:08:38.588 size: 0.000122 MiB name: rte_compressdev_data_47 00:08:38.588 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:38.589 end memzones------- 00:08:38.589 07:14:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:08:38.850 heap id: 0 total size: 814.000000 MiB number of busy elements: 635 number of free elements: 14 00:08:38.850 list of free elements. size: 11.781738 MiB 00:08:38.850 element at address: 0x200000400000 with size: 1.999512 MiB 00:08:38.850 element at address: 0x200018e00000 with size: 0.999878 MiB 00:08:38.850 element at address: 0x200019000000 with size: 0.999878 MiB 00:08:38.850 element at address: 0x200003e00000 with size: 0.996460 MiB 00:08:38.850 element at address: 0x200031c00000 with size: 0.994446 MiB 00:08:38.850 element at address: 0x200013800000 with size: 0.978699 MiB 00:08:38.851 element at address: 0x200007000000 with size: 0.959839 MiB 00:08:38.851 element at address: 0x200019200000 with size: 0.936584 MiB 00:08:38.851 element at address: 0x20001aa00000 with size: 0.564758 MiB 00:08:38.851 element at address: 0x200003a00000 with size: 0.494507 MiB 00:08:38.851 element at address: 0x20000b200000 with size: 0.489075 MiB 00:08:38.851 element at address: 0x200000800000 with size: 0.486694 MiB 00:08:38.851 element at address: 0x200019400000 with size: 0.485657 MiB 00:08:38.851 element at address: 0x200027e00000 with size: 0.395752 MiB 00:08:38.851 list of standard malloc elements. 
size: 199.898254 MiB 00:08:38.851 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:08:38.851 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:08:38.851 element at address: 0x200018efff80 with size: 1.000122 MiB 00:08:38.851 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:08:38.851 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:08:38.851 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:08:38.851 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:08:38.851 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:08:38.851 element at address: 0x20000032bc80 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000032f740 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000333200 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000336cc0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000033a780 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000033e240 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000341d00 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003457c0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000349280 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000034cd40 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000350800 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003542c0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000357d80 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000035b840 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000035f300 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000366880 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000036a340 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000036de00 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003718c0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000375380 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000378e40 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000037c900 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000383e80 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000387940 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000038b400 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000392980 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000396440 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000399f00 with size: 0.004395 MiB 00:08:38.851 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003bea80 with size: 0.004395 MiB 
00:08:38.851 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003d4b00 with size: 0.004395 MiB 00:08:38.851 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:08:38.851 element at address: 0x200000329b80 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000032ac00 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000032d640 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000032e6c0 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000331100 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000332180 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000334bc0 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000335c40 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000338680 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000339700 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000033c140 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000033d1c0 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000033fc00 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000340c80 with size: 0.004028 MiB 00:08:38.851 element at address: 0x2000003436c0 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000344740 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000347180 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000348200 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000034ac40 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000034bcc0 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000034e700 with size: 0.004028 MiB 00:08:38.851 element at address: 0x20000034f780 with size: 0.004028 MiB 00:08:38.851 element at address: 0x2000003521c0 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000353240 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000355c80 with size: 0.004028 MiB 00:08:38.851 element at address: 0x200000356d00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000359740 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000035a7c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000035d200 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000035e280 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000361d40 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000364780 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000365800 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000368240 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000370840 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000373280 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000374300 with size: 0.004028 MiB 00:08:38.852 element at 
address: 0x200000376d40 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000037a800 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000037b880 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000037f340 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000381d80 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000382e00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000385840 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000389300 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000038a380 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000038cdc0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000038de40 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000390880 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000391900 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000394340 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000397e00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000398e80 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000039c940 with size: 0.004028 MiB 00:08:38.852 element at address: 0x20000039f380 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003c0440 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003cffc0 
with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:08:38.852 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:08:38.852 element at address: 0x200000200000 with size: 0.000305 MiB 00:08:38.852 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:08:38.852 element at address: 0x200000200140 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200200 with size: 0.000183 MiB 00:08:38.852 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200380 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200440 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200500 with size: 0.000183 MiB 00:08:38.852 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200680 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200740 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200800 with size: 0.000183 MiB 00:08:38.852 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200980 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200a40 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200b00 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200c80 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200d40 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200e00 with size: 0.000183 MiB 00:08:38.852 element at address: 0x200000200ec0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000002010c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000205380 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225640 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225700 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000002257c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225880 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225940 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225a00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225ac0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225b80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225c40 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225d00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225dc0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225e80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000225f40 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226000 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000002260c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226180 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226240 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226300 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226500 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000002265c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226680 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226740 with size: 0.000183 MiB 
00:08:38.853 element at address: 0x200000226800 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000002268c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226980 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226a40 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226b00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226bc0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226c80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226d40 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226e00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226ec0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000226f80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000227040 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000227100 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000329300 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000003293c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000329580 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000329640 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000329800 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000032ce80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000032d040 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000032d100 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000032d2c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000330940 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000330b00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000330bc0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000330d80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000334400 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000003345c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000334680 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000334840 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000337ec0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000338080 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000338140 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000338300 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033b980 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033bb40 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033bc00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033f440 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033f600 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033f6c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000033f880 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000342f00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000003430c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000343180 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000343340 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000003469c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000346b80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000346c40 with size: 0.000183 MiB 00:08:38.853 element at 
address: 0x200000346e00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034a480 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034a640 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034a700 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034a8c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034df40 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034e100 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034e1c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x20000034e380 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000351a00 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000351bc0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000351c80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000351e40 with size: 0.000183 MiB 00:08:38.853 element at address: 0x2000003554c0 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000355680 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000355740 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000355900 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000358f80 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000359140 with size: 0.000183 MiB 00:08:38.853 element at address: 0x200000359200 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003593c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000035ca40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000035cc00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000035ccc0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000035ce80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000360500 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003606c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000360780 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000360940 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000364180 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000364240 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000364400 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000367a80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000367c40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000367d00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036b540 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036b700 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036b980 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036f000 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036f280 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000036f440 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000372c80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000372d40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000372f00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000376580 
with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000376740 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000376800 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037a040 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037a200 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037a480 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037db00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000037df40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000381780 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000381840 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000381a00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000385080 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000385240 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000385300 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000388b40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000388d00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000388f80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000038c600 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000038c880 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000390280 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000390340 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000390500 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000393b80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000393d40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000393e00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000397640 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000397800 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x200000397a80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039b100 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039b380 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039b540 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:08:38.854 element at address: 0x20000039f000 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a2840 with size: 0.000183 MiB 
00:08:38.854 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:08:38.854 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:08:38.855 element at 
address: 0x2000003cebc0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087c980 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7e980 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7ea40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7eb00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7ebc0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7ec80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7ed40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7ee00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7eec0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7ef80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f040 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f100 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f1c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f280 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f340 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f400 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f4c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f580 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f640 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f700 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f7c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f880 with size: 0.000183 MiB 00:08:38.855 element at address: 0x200003a7f940 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d340 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d400 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d4c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d580 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d700 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d7c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d880 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27d940 
with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:08:38.855 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20001aa90940 with size: 0.000183 MiB 00:08:38.855 element at address: 0x20001aa90a00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa90ac0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa90b80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa90c40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa90d00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa90dc0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa90e80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa90f40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91000 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa910c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91180 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91240 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91300 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa913c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91480 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91540 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91600 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92680 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92800 with size: 0.000183 MiB 
00:08:38.856 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92b00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93100 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa934c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93c40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94000 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:08:38.856 element at 
address: 0x20001aa94d80 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:08:38.856 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:08:38.857 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:08:38.857 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:08:38.857 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e65500 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c600 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6dec0 
with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:08:38.857 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:08:38.857 list of memzone associated elements. 
size: 602.320007 MiB 00:08:38.857 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:08:38.857 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:08:38.857 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:08:38.857 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:08:38.857 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:08:38.857 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1544432_0 00:08:38.857 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:08:38.857 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1544432_0 00:08:38.857 element at address: 0x200003fff380 with size: 48.003052 MiB 00:08:38.857 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1544432_0 00:08:38.857 element at address: 0x2000195be940 with size: 20.255554 MiB 00:08:38.857 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:08:38.858 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:08:38.858 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:08:38.858 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:08:38.858 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1544432 00:08:38.858 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:08:38.858 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1544432 00:08:38.858 element at address: 0x2000002271c0 with size: 1.008118 MiB 00:08:38.858 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1544432 00:08:38.858 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:08:38.858 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:08:38.858 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:08:38.858 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:08:38.858 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:08:38.858 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:08:38.858 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:08:38.858 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:08:38.858 element at address: 0x200003eff180 with size: 1.000488 MiB 00:08:38.858 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1544432 00:08:38.858 element at address: 0x200003affc00 with size: 1.000488 MiB 00:08:38.858 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1544432 00:08:38.858 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:08:38.858 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1544432 00:08:38.858 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:08:38.858 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1544432 00:08:38.858 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:08:38.858 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1544432 00:08:38.858 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:08:38.858 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:08:38.858 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:08:38.858 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:08:38.858 element at address: 0x20001947c540 with size: 0.250488 MiB 00:08:38.858 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:08:38.858 element at address: 0x200000205440 with size: 0.125488 MiB 00:08:38.858 associated 
memzone info: size: 0.125366 MiB name: RG_ring_2_1544432 00:08:38.858 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:08:38.858 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:08:38.858 element at address: 0x200027e65680 with size: 0.023743 MiB 00:08:38.858 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:08:38.858 element at address: 0x200000201180 with size: 0.016113 MiB 00:08:38.858 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1544432 00:08:38.858 element at address: 0x200027e6b7c0 with size: 0.002441 MiB 00:08:38.858 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:08:38.858 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:08:38.858 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:38.858 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.0_qat 00:08:38.858 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.1_qat 00:08:38.858 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.2_qat 00:08:38.858 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.3_qat 00:08:38.858 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.4_qat 00:08:38.858 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.5_qat 00:08:38.858 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.6_qat 00:08:38.858 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.7_qat 00:08:38.858 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.0_qat 00:08:38.858 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.1_qat 00:08:38.858 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.2_qat 00:08:38.858 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:08:38.858 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.3_qat 00:08:38.858 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.4_qat 00:08:38.859 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.5_qat 00:08:38.859 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.6_qat 00:08:38.859 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.7_qat 00:08:38.859 element at address: 0x20000039b700 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.0_qat 00:08:38.859 element at address: 0x200000397c40 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 
0000:1c:01.1_qat 00:08:38.859 element at address: 0x200000394180 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.2_qat 00:08:38.859 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.3_qat 00:08:38.859 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.4_qat 00:08:38.859 element at address: 0x200000389140 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.5_qat 00:08:38.859 element at address: 0x200000385680 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.6_qat 00:08:38.859 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.7_qat 00:08:38.859 element at address: 0x20000037e100 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.0_qat 00:08:38.859 element at address: 0x20000037a640 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.1_qat 00:08:38.859 element at address: 0x200000376b80 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.2_qat 00:08:38.859 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.3_qat 00:08:38.859 element at address: 0x20000036f600 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.4_qat 00:08:38.859 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.5_qat 00:08:38.859 element at address: 0x200000368080 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.6_qat 00:08:38.859 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.7_qat 00:08:38.859 element at address: 0x200000360b00 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.0_qat 00:08:38.859 element at address: 0x20000035d040 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.1_qat 00:08:38.859 element at address: 0x200000359580 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.2_qat 00:08:38.859 element at address: 0x200000355ac0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.3_qat 00:08:38.859 element at address: 0x200000352000 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.4_qat 00:08:38.859 element at address: 0x20000034e540 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.5_qat 00:08:38.859 element at address: 0x20000034aa80 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.6_qat 00:08:38.859 element at address: 0x200000346fc0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.7_qat 00:08:38.859 element at address: 0x200000343500 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.0_qat 00:08:38.859 element at address: 
0x20000033fa40 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.1_qat 00:08:38.859 element at address: 0x20000033bf80 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.2_qat 00:08:38.859 element at address: 0x2000003384c0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.3_qat 00:08:38.859 element at address: 0x200000334a00 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.4_qat 00:08:38.859 element at address: 0x200000330f40 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.5_qat 00:08:38.859 element at address: 0x20000032d480 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.6_qat 00:08:38.859 element at address: 0x2000003299c0 with size: 0.000427 MiB 00:08:38.859 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.7_qat 00:08:38.859 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:08:38.859 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:38.859 element at address: 0x2000002263c0 with size: 0.000305 MiB 00:08:38.859 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1544432 00:08:38.859 element at address: 0x200000200f80 with size: 0.000305 MiB 00:08:38.859 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1544432 00:08:38.859 element at address: 0x200027e6c280 with size: 0.000305 MiB 00:08:38.859 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:08:38.859 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:08:38.859 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:08:38.859 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:08:38.859 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:08:38.859 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:08:38.859 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:08:38.859 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:08:38.859 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:08:38.859 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:08:38.859 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:08:38.859 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:08:38.859 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:08:38.860 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:08:38.860 element at 
address: 0x2000003c7700 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:08:38.860 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:08:38.860 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:08:38.860 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:08:38.860 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:08:38.860 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:08:38.860 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:08:38.860 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:08:38.860 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:08:38.860 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:08:38.860 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:08:38.860 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:08:38.860 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:08:38.860 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:08:38.860 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:08:38.860 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:08:38.860 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:08:38.860 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:08:38.860 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:08:38.860 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:08:38.860 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:08:38.860 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:08:38.860 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_23 00:08:38.860 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:08:38.860 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:08:38.860 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:08:38.860 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:08:38.860 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:08:38.860 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:08:38.860 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:08:38.860 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:08:38.860 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:08:38.860 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:08:38.860 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:08:38.860 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:08:38.860 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:08:38.860 element at address: 0x20000039b600 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:08:38.860 element at address: 0x20000039b440 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:08:38.860 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:08:38.860 element at address: 0x200000397b40 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:08:38.860 element at address: 0x200000397980 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:08:38.860 element at address: 0x200000397700 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:08:38.860 element at address: 0x200000394080 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:08:38.860 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:08:38.860 element at address: 0x200000393c40 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:08:38.860 element at address: 
0x2000003905c0 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:08:38.860 element at address: 0x200000390400 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:08:38.860 element at address: 0x200000390180 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:08:38.860 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:08:38.860 element at address: 0x20000038c940 with size: 0.000244 MiB 00:08:38.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:08:38.861 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:08:38.861 element at address: 0x200000389040 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:08:38.861 element at address: 0x200000388e80 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:08:38.861 element at address: 0x200000388c00 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:08:38.861 element at address: 0x200000385580 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:08:38.861 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:08:38.861 element at address: 0x200000385140 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:08:38.861 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:08:38.861 element at address: 0x200000381900 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:08:38.861 element at address: 0x200000381680 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:08:38.861 element at address: 0x20000037e000 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:08:38.861 element at address: 0x20000037de40 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:08:38.861 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:08:38.861 element at address: 0x20000037a540 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:08:38.861 element at address: 0x20000037a380 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:08:38.861 element at address: 0x20000037a100 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:08:38.861 element at address: 0x200000376a80 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:08:38.861 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_53 00:08:38.861 element at address: 0x200000376640 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:08:38.861 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:08:38.861 element at address: 0x200000372e00 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:08:38.861 element at address: 0x200000372b80 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:08:38.861 element at address: 0x20000036f500 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:08:38.861 element at address: 0x20000036f340 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:08:38.861 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:08:38.861 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:08:38.861 element at address: 0x20000036b880 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:08:38.861 element at address: 0x20000036b600 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:08:38.861 element at address: 0x200000367f80 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:08:38.861 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:08:38.861 element at address: 0x200000367b40 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:08:38.861 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:08:38.861 element at address: 0x200000364300 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:08:38.861 element at address: 0x200000364080 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:08:38.861 element at address: 0x200000360a00 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64 00:08:38.861 element at address: 0x200000360840 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65 00:08:38.861 element at address: 0x2000003605c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32 00:08:38.861 element at address: 0x20000035cf40 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66 00:08:38.861 element at address: 0x20000035cd80 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67 00:08:38.861 element at address: 0x20000035cb00 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33 00:08:38.861 element at address: 
0x200000359480 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68 00:08:38.861 element at address: 0x2000003592c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69 00:08:38.861 element at address: 0x200000359040 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34 00:08:38.861 element at address: 0x2000003559c0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70 00:08:38.861 element at address: 0x200000355800 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71 00:08:38.861 element at address: 0x200000355580 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35 00:08:38.861 element at address: 0x200000351f00 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72 00:08:38.861 element at address: 0x200000351d40 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73 00:08:38.861 element at address: 0x200000351ac0 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36 00:08:38.861 element at address: 0x20000034e440 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_74 00:08:38.861 element at address: 0x20000034e280 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75 00:08:38.861 element at address: 0x20000034e000 with size: 0.000244 MiB 00:08:38.861 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37 00:08:38.862 element at address: 0x20000034a980 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76 00:08:38.862 element at address: 0x20000034a7c0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77 00:08:38.862 element at address: 0x20000034a540 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38 00:08:38.862 element at address: 0x200000346ec0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78 00:08:38.862 element at address: 0x200000346d00 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79 00:08:38.862 element at address: 0x200000346a80 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39 00:08:38.862 element at address: 0x200000343400 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80 00:08:38.862 element at address: 0x200000343240 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81 00:08:38.862 element at address: 0x200000342fc0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40 00:08:38.862 element at address: 0x20000033f940 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82 00:08:38.862 element at address: 0x20000033f780 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_83 00:08:38.862 element at address: 0x20000033f500 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_41 00:08:38.862 element at address: 0x20000033be80 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84 00:08:38.862 element at address: 0x20000033bcc0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85 00:08:38.862 element at address: 0x20000033ba40 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42 00:08:38.862 element at address: 0x2000003383c0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86 00:08:38.862 element at address: 0x200000338200 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87 00:08:38.862 element at address: 0x200000337f80 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43 00:08:38.862 element at address: 0x200000334900 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88 00:08:38.862 element at address: 0x200000334740 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_89 00:08:38.862 element at address: 0x2000003344c0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44 00:08:38.862 element at address: 0x200000330e40 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90 00:08:38.862 element at address: 0x200000330c80 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91 00:08:38.862 element at address: 0x200000330a00 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45 00:08:38.862 element at address: 0x20000032d380 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92 00:08:38.862 element at address: 0x20000032d1c0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93 00:08:38.862 element at address: 0x20000032cf40 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46 00:08:38.862 element at address: 0x2000003298c0 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94 00:08:38.862 element at address: 0x200000329700 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95 00:08:38.862 element at address: 0x200000329480 with size: 0.000244 MiB 00:08:38.862 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47 00:08:38.862 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:08:38.862 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:38.862 07:14:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:08:38.862 07:14:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1544432 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 1544432 ']' 00:08:38.862 07:14:11 
dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 1544432 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1544432 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1544432' 00:08:38.862 killing process with pid 1544432 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 1544432 00:08:38.862 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 1544432 00:08:39.122 00:08:39.122 real 0m1.709s 00:08:39.122 user 0m1.891s 00:08:39.122 sys 0m0.542s 00:08:39.122 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:39.122 07:14:11 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:39.122 ************************************ 00:08:39.122 END TEST dpdk_mem_utility 00:08:39.122 ************************************ 00:08:39.389 07:14:11 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:08:39.389 07:14:11 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:39.389 07:14:11 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:39.389 07:14:11 -- common/autotest_common.sh@10 -- # set +x 00:08:39.389 ************************************ 00:08:39.389 START TEST event 00:08:39.389 ************************************ 00:08:39.389 07:14:11 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:08:39.389 * Looking for test storage... 00:08:39.389 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:08:39.389 07:14:11 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:39.389 07:14:11 event -- bdev/nbd_common.sh@6 -- # set -e 00:08:39.389 07:14:11 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:08:39.389 07:14:11 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:08:39.389 07:14:11 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:39.389 07:14:11 event -- common/autotest_common.sh@10 -- # set +x 00:08:39.389 ************************************ 00:08:39.389 START TEST event_perf 00:08:39.389 ************************************ 00:08:39.389 07:14:11 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:08:39.389 Running I/O for 1 seconds...[2024-07-25 07:14:11.877284] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
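The killprocess trace a few entries above (kill -0, uname, ps --no-headers -o comm=, kill, wait against pid 1544432) is how the harness tears down the dpdk_mem_utility app before moving on to the event suite. A minimal bash sketch of that kill-and-wait pattern, reconstructed only from the commands visible in the trace (the function name below is made up, and anything beyond the non-sudo branch shown above is an assumption; the real helper in autotest_common.sh carries more guards):

killprocess_sketch() {
  local pid=$1
  [ -n "$pid" ] || return 1                           # mirrors the '[' -z 1544432 ']' guard
  kill -0 "$pid" 2>/dev/null || return 0              # nothing to reap if the pid is already gone
  local process_name=""
  if [ "$(uname)" = Linux ]; then
    process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 for this run
  fi
  if [ "$process_name" != sudo ]; then                # only the non-sudo branch is exercised in the trace
    echo "killing process with pid $pid"
    kill "$pid"
  fi
  wait "$pid"                                         # matches the final 'wait 1544432'
}

The same pattern appears again at the end of the scheduler suite further down, there against pid 1545608.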
00:08:39.389 [2024-07-25 07:14:11.877344] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1544763 ] 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.6 cannot be used 
00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:39.648 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:39.648 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:39.648 [2024-07-25 07:14:12.007689] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:39.648 [2024-07-25 07:14:12.095585] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:39.648 [2024-07-25 07:14:12.095679] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:08:39.648 [2024-07-25 07:14:12.095764] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:08:39.648 [2024-07-25 07:14:12.095768] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.021 Running I/O for 1 seconds... 00:08:41.021 lcore 0: 183871 00:08:41.021 lcore 1: 183869 00:08:41.021 lcore 2: 183869 00:08:41.021 lcore 3: 183871 00:08:41.021 done. 00:08:41.021 00:08:41.021 real 0m1.321s 00:08:41.021 user 0m4.168s 00:08:41.021 sys 0m0.147s 00:08:41.021 07:14:13 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:41.021 07:14:13 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:08:41.021 ************************************ 00:08:41.021 END TEST event_perf 00:08:41.021 ************************************ 00:08:41.021 07:14:13 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:08:41.021 07:14:13 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:08:41.021 07:14:13 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:41.021 07:14:13 event -- common/autotest_common.sh@10 -- # set +x 00:08:41.021 ************************************ 00:08:41.021 START TEST event_reactor 00:08:41.021 ************************************ 00:08:41.021 07:14:13 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:08:41.021 [2024-07-25 07:14:13.276980] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
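event_perf above and every suite that follows go through the same run_test wrapper from autotest_common.sh: an argument-count probe ('[' 6 -le 1 ']' in the trace), an xtrace toggle, a row-of-asterisks START TEST banner, the test body, the bash time summary (the real/user/sys triplet printed after each suite), and a matching END TEST banner. A rough sketch of that wrapper, assuming the banners and timing are emitted more or less as below (function name is invented; the real implementation differs in detail):

run_test_sketch() {
  local test_name=$1; shift
  [ "$#" -ge 1 ] || return 1                 # arg-count check, cf. the '[' 6 -le 1 ']' probe in the trace
  echo "************************************"
  echo "START TEST $test_name"
  echo "************************************"
  time "$@"                                  # produces the real/user/sys lines seen after each suite
  local rc=$?
  echo "************************************"
  echo "END TEST $test_name"
  echo "************************************"
  return $rc
}

# e.g. the call traced for the suite above:
# run_test_sketch event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1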
00:08:41.021 [2024-07-25 07:14:13.277051] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545045 ] 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.021 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:41.021 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:01.6 cannot be used 00:08:41.022 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:41.022 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:41.022 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:41.022 [2024-07-25 07:14:13.410145] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.022 [2024-07-25 07:14:13.491974] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.393 test_start 00:08:42.393 oneshot 00:08:42.393 tick 100 00:08:42.393 tick 100 00:08:42.393 tick 250 00:08:42.393 tick 100 00:08:42.393 tick 100 00:08:42.393 tick 100 00:08:42.393 tick 250 00:08:42.393 tick 500 00:08:42.393 tick 100 00:08:42.393 tick 100 00:08:42.393 tick 250 00:08:42.393 tick 100 00:08:42.393 tick 100 00:08:42.393 test_end 00:08:42.393 00:08:42.393 real 0m1.318s 00:08:42.393 user 0m1.178s 00:08:42.393 sys 0m0.134s 00:08:42.393 07:14:14 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:42.393 07:14:14 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:08:42.393 ************************************ 00:08:42.393 END TEST event_reactor 00:08:42.393 ************************************ 00:08:42.393 07:14:14 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:08:42.393 07:14:14 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:08:42.393 07:14:14 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:42.393 07:14:14 event -- common/autotest_common.sh@10 -- # set +x 00:08:42.393 ************************************ 00:08:42.393 START TEST event_reactor_perf 00:08:42.393 ************************************ 00:08:42.393 07:14:14 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:08:42.393 [2024-07-25 07:14:14.677837] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
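Each EAL start-up in this job (event_perf and reactor above, reactor_perf next, then the scheduler and app_repeat runs) repeats the same block of qat_pci_device_allocate() / "Requested device ... cannot be used" warnings for the 0000:3d and 0000:3f QAT virtual functions; the apps still initialize and the suites run to their END TEST banners, so for these event tests the block reads as noise rather than a failure. When working from a saved copy of this console output, a quick way to confirm it is the same fixed set of VF addresses every time (the log file name here is assumed):

# tally the QAT warnings and the affected VF BDFs in a saved console log (path is an assumption)
grep -c 'Reached maximum number of QAT devices' crypto-phy-autotest.console.log
grep -Eo 'Requested device [0-9a-f:.]+' crypto-phy-autotest.console.log | sort | uniq -c | sort -rn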
00:08:42.393 [2024-07-25 07:14:14.677895] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545307 ] 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.6 cannot be used 
00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:42.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:42.393 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:42.393 [2024-07-25 07:14:14.810098] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.393 [2024-07-25 07:14:14.892593] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.765 test_start 00:08:43.765 test_end 00:08:43.765 Performance: 356609 events per second 00:08:43.765 00:08:43.765 real 0m1.321s 00:08:43.765 user 0m1.179s 00:08:43.765 sys 0m0.136s 00:08:43.765 07:14:15 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:43.765 07:14:15 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:08:43.765 ************************************ 00:08:43.765 END TEST event_reactor_perf 00:08:43.765 ************************************ 00:08:43.765 07:14:16 event -- event/event.sh@49 -- # uname -s 00:08:43.765 07:14:16 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:08:43.765 07:14:16 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:08:43.765 07:14:16 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:43.765 07:14:16 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:43.765 07:14:16 event -- common/autotest_common.sh@10 -- # set +x 00:08:43.765 ************************************ 00:08:43.765 START TEST event_scheduler 00:08:43.765 ************************************ 00:08:43.766 07:14:16 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:08:43.766 * Looking for test storage... 
00:08:43.766 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:08:43.766 07:14:16 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:08:43.766 07:14:16 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1545608 00:08:43.766 07:14:16 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:08:43.766 07:14:16 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:08:43.766 07:14:16 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1545608 00:08:43.766 07:14:16 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 1545608 ']' 00:08:43.766 07:14:16 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:43.766 07:14:16 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:43.766 07:14:16 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:43.766 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:43.766 07:14:16 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:43.766 07:14:16 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:43.766 [2024-07-25 07:14:16.219186] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:08:43.766 [2024-07-25 07:14:16.219250] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545608 ] 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number 
of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.1 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.6 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:43.766 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:43.766 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:44.025 [2024-07-25 07:14:16.323510] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:44.025 [2024-07-25 07:14:16.401838] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.025 [2024-07-25 07:14:16.401922] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:44.025 [2024-07-25 07:14:16.402006] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:08:44.025 [2024-07-25 07:14:16.402008] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:08:44.959 07:14:17 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:44.959 07:14:17 
event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:08:44.959 07:14:17 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:08:44.959 07:14:17 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.959 07:14:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:44.959 [2024-07-25 07:14:17.140612] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:08:44.959 [2024-07-25 07:14:17.140632] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:08:44.959 [2024-07-25 07:14:17.140643] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:08:44.960 [2024-07-25 07:14:17.140651] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:08:44.960 [2024-07-25 07:14:17.140658] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:08:44.960 07:14:17 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:08:44.960 07:14:17 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 [2024-07-25 07:14:17.223513] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:08:44.960 07:14:17 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:08:44.960 07:14:17 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:44.960 07:14:17 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 ************************************ 00:08:44.960 START TEST scheduler_create_thread 00:08:44.960 ************************************ 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 2 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 3 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin 
scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 4 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 5 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 6 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 7 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 8 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 9 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 10 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.960 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:45.527 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:45.527 07:14:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:08:45.527 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:45.527 07:14:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:46.902 07:14:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:46.902 07:14:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:08:46.902 07:14:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:08:46.902 07:14:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:46.902 07:14:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:47.837 07:14:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:47.837 00:08:47.837 real 0m3.100s 00:08:47.837 user 0m0.025s 00:08:47.837 sys 0m0.006s 00:08:47.837 07:14:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:47.837 07:14:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:47.837 ************************************ 00:08:47.837 END TEST scheduler_create_thread 00:08:47.837 ************************************ 00:08:48.096 07:14:20 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:08:48.096 07:14:20 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1545608 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 1545608 ']' 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 1545608 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:08:48.096 07:14:20 
event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1545608 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1545608' 00:08:48.096 killing process with pid 1545608 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 1545608 00:08:48.096 07:14:20 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 1545608 00:08:48.354 [2024-07-25 07:14:20.742795] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:08:48.613 00:08:48.613 real 0m4.901s 00:08:48.613 user 0m9.592s 00:08:48.613 sys 0m0.496s 00:08:48.613 07:14:20 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:48.613 07:14:20 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:48.613 ************************************ 00:08:48.613 END TEST event_scheduler 00:08:48.613 ************************************ 00:08:48.613 07:14:20 event -- event/event.sh@51 -- # modprobe -n nbd 00:08:48.613 07:14:21 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:08:48.613 07:14:21 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:48.613 07:14:21 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:48.613 07:14:21 event -- common/autotest_common.sh@10 -- # set +x 00:08:48.613 ************************************ 00:08:48.613 START TEST app_repeat 00:08:48.613 ************************************ 00:08:48.613 07:14:21 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1546483 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1546483' 00:08:48.613 Process app_repeat pid: 1546483 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:08:48.613 spdk_app_start Round 0 00:08:48.613 07:14:21 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1546483 /var/tmp/spdk-nbd.sock 00:08:48.613 07:14:21 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1546483 ']' 00:08:48.613 07:14:21 event.app_repeat 
-- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:48.613 07:14:21 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:48.613 07:14:21 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:48.613 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:48.613 07:14:21 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:48.613 07:14:21 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:48.613 [2024-07-25 07:14:21.092188] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:08:48.613 [2024-07-25 07:14:21.092250] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1546483 ] 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.0 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.1 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.2 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.3 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.4 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.5 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.6 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:01.7 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.0 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.1 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.2 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.3 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.4 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.5 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.6 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3d:02.7 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:01.0 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested 
device 0000:3f:01.1 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:01.2 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:01.3 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:01.4 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:01.5 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:01.6 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:01.7 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.0 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.1 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.2 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.3 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.4 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.5 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.6 cannot be used 00:08:48.872 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:08:48.872 EAL: Requested device 0000:3f:02.7 cannot be used 00:08:48.872 [2024-07-25 07:14:21.227019] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:48.872 [2024-07-25 07:14:21.312135] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:48.872 [2024-07-25 07:14:21.312146] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.807 07:14:22 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:49.807 07:14:22 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:08:49.807 07:14:22 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:49.807 Malloc0 00:08:49.807 07:14:22 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:50.066 Malloc1 00:08:50.066 07:14:22 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:50.066 
07:14:22 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:50.066 07:14:22 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:50.325 /dev/nbd0 00:08:50.325 07:14:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:50.325 07:14:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:50.325 1+0 records in 00:08:50.325 1+0 records out 00:08:50.325 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258705 s, 15.8 MB/s 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:08:50.325 07:14:22 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:08:50.325 07:14:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:50.325 07:14:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:50.325 07:14:22 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:50.584 /dev/nbd1 00:08:50.584 07:14:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:50.584 07:14:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:50.584 07:14:22 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:08:50.584 07:14:22 
event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:08:50.584 07:14:22 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:08:50.584 07:14:22 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:08:50.584 07:14:22 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:50.584 1+0 records in 00:08:50.584 1+0 records out 00:08:50.584 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260072 s, 15.7 MB/s 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:08:50.584 07:14:23 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:08:50.584 07:14:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:50.584 07:14:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:50.584 07:14:23 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:50.584 07:14:23 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.584 07:14:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:50.843 { 00:08:50.843 "nbd_device": "/dev/nbd0", 00:08:50.843 "bdev_name": "Malloc0" 00:08:50.843 }, 00:08:50.843 { 00:08:50.843 "nbd_device": "/dev/nbd1", 00:08:50.843 "bdev_name": "Malloc1" 00:08:50.843 } 00:08:50.843 ]' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:50.843 { 00:08:50.843 "nbd_device": "/dev/nbd0", 00:08:50.843 "bdev_name": "Malloc0" 00:08:50.843 }, 00:08:50.843 { 00:08:50.843 "nbd_device": "/dev/nbd1", 00:08:50.843 "bdev_name": "Malloc1" 00:08:50.843 } 00:08:50.843 ]' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:50.843 /dev/nbd1' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:50.843 /dev/nbd1' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:50.843 07:14:23 
event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:50.843 256+0 records in 00:08:50.843 256+0 records out 00:08:50.843 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112564 s, 93.2 MB/s 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:50.843 256+0 records in 00:08:50.843 256+0 records out 00:08:50.843 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0275419 s, 38.1 MB/s 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:50.843 256+0 records in 00:08:50.843 256+0 records out 00:08:50.843 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0277684 s, 37.8 MB/s 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@51 -- 
# local i 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.843 07:14:23 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.106 07:14:23 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:51.369 07:14:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:51.627 07:14:24 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:51.627 07:14:24 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:51.886 07:14:24 event.app_repeat -- event/event.sh@35 -- # 
sleep 3 00:08:52.144 [2024-07-25 07:14:24.611044] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:52.403 [2024-07-25 07:14:24.688131] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:52.403 [2024-07-25 07:14:24.688135] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.403 [2024-07-25 07:14:24.732598] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:52.403 [2024-07-25 07:14:24.732644] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:54.972 07:14:27 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:54.972 07:14:27 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:08:54.972 spdk_app_start Round 1 00:08:54.972 07:14:27 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1546483 /var/tmp/spdk-nbd.sock 00:08:54.972 07:14:27 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1546483 ']' 00:08:54.972 07:14:27 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:54.972 07:14:27 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:54.972 07:14:27 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:54.972 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:54.972 07:14:27 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:54.972 07:14:27 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:55.230 07:14:27 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:55.230 07:14:27 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:08:55.230 07:14:27 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:55.488 Malloc0 00:08:55.488 07:14:27 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:55.746 Malloc1 00:08:55.746 07:14:28 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 
00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:55.746 07:14:28 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:56.004 /dev/nbd0 00:08:56.004 07:14:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:56.004 07:14:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:08:56.004 07:14:28 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:08:56.005 07:14:28 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:56.005 1+0 records in 00:08:56.005 1+0 records out 00:08:56.005 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246865 s, 16.6 MB/s 00:08:56.005 07:14:28 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.005 07:14:28 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:08:56.005 07:14:28 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.005 07:14:28 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:08:56.005 07:14:28 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:08:56.005 07:14:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.005 07:14:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:56.005 07:14:28 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:56.263 /dev/nbd1 00:08:56.263 07:14:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:56.263 07:14:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:08:56.263 
07:14:28 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:56.263 1+0 records in 00:08:56.263 1+0 records out 00:08:56.263 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271924 s, 15.1 MB/s 00:08:56.263 07:14:28 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.264 07:14:28 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:08:56.264 07:14:28 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.264 07:14:28 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:08:56.264 07:14:28 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:08:56.264 07:14:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.264 07:14:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:56.264 07:14:28 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:56.264 07:14:28 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.264 07:14:28 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:56.522 { 00:08:56.522 "nbd_device": "/dev/nbd0", 00:08:56.522 "bdev_name": "Malloc0" 00:08:56.522 }, 00:08:56.522 { 00:08:56.522 "nbd_device": "/dev/nbd1", 00:08:56.522 "bdev_name": "Malloc1" 00:08:56.522 } 00:08:56.522 ]' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:56.522 { 00:08:56.522 "nbd_device": "/dev/nbd0", 00:08:56.522 "bdev_name": "Malloc0" 00:08:56.522 }, 00:08:56.522 { 00:08:56.522 "nbd_device": "/dev/nbd1", 00:08:56.522 "bdev_name": "Malloc1" 00:08:56.522 } 00:08:56.522 ]' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:56.522 /dev/nbd1' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:56.522 /dev/nbd1' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:56.522 256+0 records in 00:08:56.522 256+0 records out 00:08:56.522 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106315 s, 98.6 MB/s 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:56.522 256+0 records in 00:08:56.522 256+0 records out 00:08:56.522 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.016951 s, 61.9 MB/s 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:56.522 256+0 records in 00:08:56.522 256+0 records out 00:08:56.522 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0289079 s, 36.3 MB/s 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.522 07:14:28 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:56.523 07:14:28 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:56.523 07:14:28 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:56.523 07:14:28 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.523 07:14:28 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:56.780 07:14:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:56.780 07:14:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:56.780 07:14:29 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:56.780 07:14:29 
event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:56.780 07:14:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:56.781 07:14:29 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:56.781 07:14:29 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:56.781 07:14:29 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:56.781 07:14:29 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.781 07:14:29 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:57.038 07:14:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:57.296 07:14:29 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:57.297 07:14:29 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:57.863 07:14:30 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:58.121 [2024-07-25 07:14:30.543860] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:58.122 [2024-07-25 07:14:30.621679] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:58.122 [2024-07-25 07:14:30.621684] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.379 [2024-07-25 07:14:30.667222] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 
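
Each app_repeat round traced above is the same write/verify pass over malloc bdevs exposed through the kernel nbd driver. A condensed sketch of one device's worth of that flow, assuming the nbd module is loaded and an SPDK app is listening on /var/tmp/spdk-nbd.sock; the temp-file path is a stand-in for test/event/nbdrandtest:

    # Condensed sketch of the per-round nbd write/verify flow shown above.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock "$@"; }
    tmp=$(mktemp)                                    # stand-in for .../test/event/nbdrandtest
    rpc bdev_malloc_create 64 4096                   # 64 MB bdev, 4 KiB blocks; prints the bdev name (Malloc0)
    rpc nbd_start_disk Malloc0 /dev/nbd0             # expose the bdev as /dev/nbd0
    dd if=/dev/urandom of="$tmp" bs=4096 count=256   # 1 MiB of reference data
    dd if="$tmp" of=/dev/nbd0 bs=4096 count=256 oflag=direct
    cmp -b -n 1M "$tmp" /dev/nbd0                    # read back and compare against the reference
    rpc nbd_stop_disk /dev/nbd0
    rpc nbd_get_disks                                # expect an empty list once all disks are stopped
    rm -f "$tmp"
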
00:08:58.379 [2024-07-25 07:14:30.667266] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:09:00.909 07:14:33 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:09:00.909 07:14:33 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:09:00.909 spdk_app_start Round 2 00:09:00.909 07:14:33 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1546483 /var/tmp/spdk-nbd.sock 00:09:00.909 07:14:33 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1546483 ']' 00:09:00.909 07:14:33 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:00.909 07:14:33 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:00.909 07:14:33 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:00.909 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:00.909 07:14:33 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:00.909 07:14:33 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:01.476 07:14:33 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:01.476 07:14:33 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:09:01.476 07:14:33 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:01.735 Malloc0 00:09:01.735 07:14:34 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:01.994 Malloc1 00:09:01.994 07:14:34 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:01.994 07:14:34 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:09:01.995 07:14:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:01.995 07:14:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:01.995 07:14:34 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:02.254 /dev/nbd0 00:09:02.254 07:14:34 
event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:02.254 07:14:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:02.254 1+0 records in 00:09:02.254 1+0 records out 00:09:02.254 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263696 s, 15.5 MB/s 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:02.254 07:14:34 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:09:02.254 07:14:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:02.254 07:14:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:02.254 07:14:34 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:09:02.512 /dev/nbd1 00:09:02.512 07:14:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:02.512 07:14:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:02.512 1+0 records in 00:09:02.512 1+0 records out 00:09:02.512 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250036 s, 16.4 MB/s 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 
00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:02.512 07:14:34 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:09:02.513 07:14:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:02.513 07:14:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:02.513 07:14:34 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:02.513 07:14:34 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:02.513 07:14:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:02.513 07:14:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:02.513 { 00:09:02.513 "nbd_device": "/dev/nbd0", 00:09:02.513 "bdev_name": "Malloc0" 00:09:02.513 }, 00:09:02.513 { 00:09:02.513 "nbd_device": "/dev/nbd1", 00:09:02.513 "bdev_name": "Malloc1" 00:09:02.513 } 00:09:02.513 ]' 00:09:02.513 07:14:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:02.513 { 00:09:02.513 "nbd_device": "/dev/nbd0", 00:09:02.513 "bdev_name": "Malloc0" 00:09:02.513 }, 00:09:02.513 { 00:09:02.513 "nbd_device": "/dev/nbd1", 00:09:02.513 "bdev_name": "Malloc1" 00:09:02.513 } 00:09:02.513 ]' 00:09:02.513 07:14:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:02.772 /dev/nbd1' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:02.772 /dev/nbd1' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:09:02.772 256+0 records in 00:09:02.772 256+0 records out 00:09:02.772 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113112 s, 92.7 MB/s 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 
oflag=direct 00:09:02.772 256+0 records in 00:09:02.772 256+0 records out 00:09:02.772 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0172641 s, 60.7 MB/s 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:02.772 256+0 records in 00:09:02.772 256+0 records out 00:09:02.772 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0179333 s, 58.5 MB/s 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:02.772 07:14:35 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:03.031 07:14:35 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:03.031 07:14:35 
event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:03.290 07:14:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:03.549 07:14:35 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:09:03.549 07:14:35 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:09:03.549 07:14:36 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:09:03.808 [2024-07-25 07:14:36.316076] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:04.066 [2024-07-25 07:14:36.396661] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:04.066 [2024-07-25 07:14:36.396665] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.066 [2024-07-25 07:14:36.441130] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:09:04.066 [2024-07-25 07:14:36.441187] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
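
The waitfornbd / waitfornbd_exit steps interleaved in the traces above are readiness polls: they watch /proc/partitions for the nbd name and, on attach, read one block back to confirm the device answers I/O. A simplified sketch of the attach-side check; the retry sleep and the temp-file path are assumptions, and the real helper in autotest_common.sh carries more handling than shown here:

    # Simplified sketch of the nbd readiness check traced above (not the exact helper).
    waitfornbd() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break   # device visible to the kernel?
            sleep 0.1                                          # assumption: brief pause between retries
        done
        # read a single block to make sure the device actually serves I/O
        dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
        local size
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ]                                       # non-empty read means the disk is usable
    }
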
00:09:06.601 07:14:39 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1546483 /var/tmp/spdk-nbd.sock 00:09:06.601 07:14:39 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1546483 ']' 00:09:06.601 07:14:39 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:06.601 07:14:39 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:06.601 07:14:39 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:06.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:06.601 07:14:39 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:06.601 07:14:39 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:09:06.860 07:14:39 event.app_repeat -- event/event.sh@39 -- # killprocess 1546483 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 1546483 ']' 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 1546483 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1546483 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1546483' 00:09:06.860 killing process with pid 1546483 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@969 -- # kill 1546483 00:09:06.860 07:14:39 event.app_repeat -- common/autotest_common.sh@974 -- # wait 1546483 00:09:07.119 spdk_app_start is called in Round 0. 00:09:07.119 Shutdown signal received, stop current app iteration 00:09:07.119 Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 reinitialization... 00:09:07.119 spdk_app_start is called in Round 1. 00:09:07.119 Shutdown signal received, stop current app iteration 00:09:07.119 Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 reinitialization... 00:09:07.119 spdk_app_start is called in Round 2. 00:09:07.119 Shutdown signal received, stop current app iteration 00:09:07.119 Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 reinitialization... 00:09:07.119 spdk_app_start is called in Round 3. 
00:09:07.119 Shutdown signal received, stop current app iteration 00:09:07.119 07:14:39 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:09:07.119 07:14:39 event.app_repeat -- event/event.sh@42 -- # return 0 00:09:07.119 00:09:07.119 real 0m18.499s 00:09:07.119 user 0m40.226s 00:09:07.119 sys 0m3.670s 00:09:07.119 07:14:39 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:07.119 07:14:39 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:07.119 ************************************ 00:09:07.119 END TEST app_repeat 00:09:07.119 ************************************ 00:09:07.119 07:14:39 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:09:07.119 00:09:07.119 real 0m27.871s 00:09:07.119 user 0m56.528s 00:09:07.119 sys 0m4.949s 00:09:07.119 07:14:39 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:07.119 07:14:39 event -- common/autotest_common.sh@10 -- # set +x 00:09:07.119 ************************************ 00:09:07.119 END TEST event 00:09:07.119 ************************************ 00:09:07.119 07:14:39 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:07.119 07:14:39 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:07.119 07:14:39 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:07.119 07:14:39 -- common/autotest_common.sh@10 -- # set +x 00:09:07.381 ************************************ 00:09:07.381 START TEST thread 00:09:07.381 ************************************ 00:09:07.381 07:14:39 thread -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:07.381 * Looking for test storage... 00:09:07.381 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:09:07.381 07:14:39 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:07.381 07:14:39 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:09:07.381 07:14:39 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:07.381 07:14:39 thread -- common/autotest_common.sh@10 -- # set +x 00:09:07.381 ************************************ 00:09:07.381 START TEST thread_poller_perf 00:09:07.381 ************************************ 00:09:07.381 07:14:39 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:07.381 [2024-07-25 07:14:39.828299] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:09:07.381 [2024-07-25 07:14:39.828354] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1549890 ] 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:07.381 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.381 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:07.640 [2024-07-25 07:14:39.960825] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.640 [2024-07-25 07:14:40.048380] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.640 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:09:09.016 ====================================== 00:09:09.016 busy:2515635164 (cyc) 00:09:09.016 total_run_count: 285000 00:09:09.016 tsc_hz: 2500000000 (cyc) 00:09:09.016 ====================================== 00:09:09.016 poller_cost: 8826 (cyc), 3530 (nsec) 00:09:09.016 00:09:09.016 real 0m1.323s 00:09:09.016 user 0m1.186s 00:09:09.016 sys 0m0.130s 00:09:09.016 07:14:41 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:09.016 07:14:41 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:09:09.016 ************************************ 00:09:09.016 END TEST thread_poller_perf 00:09:09.016 ************************************ 00:09:09.016 07:14:41 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:09:09.016 07:14:41 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:09:09.016 07:14:41 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:09.016 07:14:41 thread -- common/autotest_common.sh@10 -- # set +x 00:09:09.016 ************************************ 00:09:09.016 START TEST thread_poller_perf 00:09:09.016 ************************************ 00:09:09.016 07:14:41 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:09:09.016 [2024-07-25 07:14:41.244663] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
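The summary block above is the result of the 1 microsecond-period run: poller_cost is the busy TSC cycle count divided by the number of poller executions, converted to nanoseconds with the reported TSC frequency. A small sketch that reproduces the printed numbers from the raw counters (the three values are copied from the output above; the division is an interpretation of the report format, not taken from the poller_perf source):

    busy=2515635164          # busy TSC cycles reported above
    total_run_count=285000   # poller executions during the 1 second run
    tsc_hz=2500000000        # TSC frequency reported above

    awk -v b="$busy" -v n="$total_run_count" -v hz="$tsc_hz" 'BEGIN {
        cyc  = b / n             # cycles per poller run
        nsec = cyc * 1e9 / hz    # cycles converted to nanoseconds
        printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, nsec
    }'

This prints poller_cost: 8826 (cyc), 3530 (nsec), matching the report; the same arithmetic applies to the 0 microsecond-period run that starts directly below.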
00:09:09.016 [2024-07-25 07:14:41.244738] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1550178 ]
00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:09.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:09.016 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:09.016 [2024-07-25 07:14:41.377260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.016 [2024-07-25 07:14:41.458974] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.016 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:09:10.417 ====================================== 00:09:10.418 busy:2502385748 (cyc) 00:09:10.418 total_run_count: 3799000 00:09:10.418 tsc_hz: 2500000000 (cyc) 00:09:10.418 ====================================== 00:09:10.418 poller_cost: 658 (cyc), 263 (nsec) 00:09:10.418 00:09:10.418 real 0m1.319s 00:09:10.418 user 0m1.174s 00:09:10.418 sys 0m0.140s 00:09:10.418 07:14:42 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:10.418 07:14:42 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:09:10.418 ************************************ 00:09:10.418 END TEST thread_poller_perf 00:09:10.418 ************************************ 00:09:10.418 07:14:42 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:09:10.418 00:09:10.418 real 0m2.909s 00:09:10.418 user 0m2.461s 00:09:10.418 sys 0m0.458s 00:09:10.418 07:14:42 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:10.418 07:14:42 thread -- common/autotest_common.sh@10 -- # set +x 00:09:10.418 ************************************ 00:09:10.418 END TEST thread 00:09:10.418 ************************************ 00:09:10.418 07:14:42 -- spdk/autotest.sh@184 -- # [[ 1 -eq 1 ]] 00:09:10.418 07:14:42 -- spdk/autotest.sh@185 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:10.418 07:14:42 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:10.418 07:14:42 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:10.418 07:14:42 -- common/autotest_common.sh@10 -- # set +x 00:09:10.418 ************************************ 00:09:10.418 START TEST accel 00:09:10.418 ************************************ 00:09:10.418 07:14:42 accel -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:10.418 * Looking for test storage... 
00:09:10.418 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:10.418 07:14:42 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:09:10.418 07:14:42 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:09:10.418 07:14:42 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:10.418 07:14:42 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1550498 00:09:10.418 07:14:42 accel -- accel/accel.sh@63 -- # waitforlisten 1550498 00:09:10.418 07:14:42 accel -- common/autotest_common.sh@831 -- # '[' -z 1550498 ']' 00:09:10.418 07:14:42 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:10.418 07:14:42 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:10.418 07:14:42 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:10.418 07:14:42 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:10.418 07:14:42 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:10.418 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:10.418 07:14:42 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:10.418 07:14:42 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:10.418 07:14:42 accel -- common/autotest_common.sh@10 -- # set +x 00:09:10.418 07:14:42 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:10.418 07:14:42 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:10.418 07:14:42 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:10.418 07:14:42 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:10.418 07:14:42 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:10.418 07:14:42 accel -- accel/accel.sh@41 -- # jq -r . 00:09:10.418 [2024-07-25 07:14:42.833697] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:09:10.418 [2024-07-25 07:14:42.833758] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1550498 ]
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.418 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:10.418 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.418 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:10.418 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.418 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:10.418 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.418 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:10.418 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.418 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:10.418 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.418 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:10.419 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.419 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:10.419 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.419 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:10.419 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:10.419 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:10.678 [2024-07-25 07:14:42.966552] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.678 [2024-07-25 07:14:43.049791] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.261 07:14:43 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:11.261 07:14:43 accel -- common/autotest_common.sh@864 -- # return 0 00:09:11.261 07:14:43 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:11.261 07:14:43 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:11.261 07:14:43 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:11.261 07:14:43 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:09:11.261 07:14:43 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:11.262 07:14:43 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:11.262 07:14:43 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@10 -- # set +x 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 
07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # IFS== 00:09:11.262 07:14:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:11.262 07:14:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:11.262 07:14:43 accel -- accel/accel.sh@75 -- # killprocess 1550498 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@950 -- # '[' -z 1550498 ']' 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@954 -- # kill -0 1550498 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@955 -- # uname 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:11.262 07:14:43 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1550498 00:09:11.521 07:14:43 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:11.521 07:14:43 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:11.521 07:14:43 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1550498' 00:09:11.521 killing process with pid 1550498 00:09:11.521 07:14:43 accel -- common/autotest_common.sh@969 -- # kill 1550498 00:09:11.521 07:14:43 accel -- common/autotest_common.sh@974 -- # wait 1550498 00:09:11.780 07:14:44 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:11.780 07:14:44 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:09:11.780 07:14:44 accel -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:11.780 07:14:44 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:11.780 07:14:44 accel -- common/autotest_common.sh@10 -- # set +x 00:09:11.780 07:14:44 accel.accel_help -- common/autotest_common.sh@1125 -- # accel_perf -h 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:09:11.780 07:14:44 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
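The long IFS== / read loop above is get_expected_opcs filling the expected_opcs associative array from the accel_get_opc_assignments RPC output, one opcode=module pair per line of the jq output. A condensed sketch of that parsing step; the inline JSON is a stand-in for the real RPC response and lists only two opcodes:

    declare -A expected_opcs

    # In the test the JSON comes from: scripts/rpc.py accel_get_opc_assignments
    opc_json='{"copy":"software","fill":"software"}'

    exp_opcs=($(echo "$opc_json" \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))

    for opc_opt in "${exp_opcs[@]}"; do
        IFS== read -r opc module <<< "$opc_opt"   # split "copy=software" at '='
        expected_opcs["$opc"]=$module
    done

    declare -p expected_opcs                      # both opcodes now map to "software"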
00:09:11.780 07:14:44 accel.accel_help -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:11.780 07:14:44 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:09:11.780 07:14:44 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:09:11.780 07:14:44 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:11.780 07:14:44 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:11.780 07:14:44 accel -- common/autotest_common.sh@10 -- # set +x 00:09:11.780 ************************************ 00:09:11.780 START TEST accel_missing_filename 00:09:11.780 ************************************ 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # local es=0 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # type -t accel_perf 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:12.041 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:09:12.041 07:14:44 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:09:12.041 [2024-07-25 07:14:44.352335] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:09:12.041 [2024-07-25 07:14:44.352393] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1550803 ]
00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:12.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.042 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:12.042 [2024-07-25 07:14:44.483850] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.042 [2024-07-25 07:14:44.566802] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.301 [2024-07-25 07:14:44.622258] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:12.301 [2024-07-25 07:14:44.684856] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:09:12.301 A filename is required. 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # es=234 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # es=106 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@663 -- # case "$es" in 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@670 -- # es=1 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:12.301 00:09:12.301 real 0m0.450s 00:09:12.301 user 0m0.289s 00:09:12.301 sys 0m0.192s 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:12.301 07:14:44 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:09:12.301 ************************************ 00:09:12.301 END TEST accel_missing_filename 00:09:12.301 ************************************ 00:09:12.301 07:14:44 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.301 07:14:44 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:09:12.301 07:14:44 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:12.301 07:14:44 accel -- common/autotest_common.sh@10 -- # set +x 00:09:12.560 ************************************ 00:09:12.560 START TEST accel_compress_verify 00:09:12.560 ************************************ 00:09:12.560 07:14:44 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.560 07:14:44 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # local es=0 00:09:12.560 07:14:44 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.560 07:14:44 accel.accel_compress_verify -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:09:12.560 07:14:44 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:12.560 07:14:44 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # type -t accel_perf 00:09:12.560 07:14:44 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:12.560 07:14:44 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:12.560 07:14:44 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:09:12.560 [2024-07-25 07:14:44.882055] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
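Both accel_missing_filename above and accel_compress_verify here run accel_perf through the NOT helper: the test passes only when the command fails, and the captured exit status is then folded down to 1 (the log records es=234 for the missing -l input above and es=161 for compress with the verify flag below). A simplified stand-in for that pattern; the real helper in autotest_common.sh does more exit-status bookkeeping than this:

    # Run a command that is expected to fail; succeed only if it does.
    NOT() {
        if "$@"; then
            return 1        # unexpected success
        fi
        return 0            # failed as expected
    }

    accel_perf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf

    # compress needs -l <input file>, so leaving it off is expected to fail
    NOT "$accel_perf" -t 1 -w compress && echo "negative test passed"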
00:09:12.560 [2024-07-25 07:14:44.882111] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1550826 ]
00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:12.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:12.560 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:12.560 [2024-07-25 07:14:45.012966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.819 [2024-07-25 07:14:45.096573] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.819 [2024-07-25 07:14:45.156097] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:12.819 [2024-07-25 07:14:45.220189] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:09:12.819 00:09:12.819 Compression does not support the verify option, aborting. 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # es=161 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # es=33 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@663 -- # case "$es" in 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@670 -- # es=1 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:12.819 00:09:12.819 real 0m0.455s 00:09:12.819 user 0m0.299s 00:09:12.819 sys 0m0.184s 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:12.819 07:14:45 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:09:12.819 ************************************ 00:09:12.819 END TEST accel_compress_verify 00:09:12.819 ************************************ 00:09:12.820 07:14:45 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:09:12.820 07:14:45 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:12.820 07:14:45 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:12.820 07:14:45 accel -- common/autotest_common.sh@10 -- # set +x 00:09:13.078 ************************************ 00:09:13.078 START TEST accel_wrong_workload 00:09:13.078 ************************************ 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w foobar 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # local es=0 00:09:13.078 07:14:45 
accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # type -t accel_perf 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:09:13.078 07:14:45 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:09:13.078 Unsupported workload type: foobar 00:09:13.078 [2024-07-25 07:14:45.416168] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:09:13.078 accel_perf options: 00:09:13.078 [-h help message] 00:09:13.078 [-q queue depth per core] 00:09:13.078 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:09:13.078 [-T number of threads per core 00:09:13.078 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:09:13.078 [-t time in seconds] 00:09:13.078 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:09:13.078 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:09:13.078 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:09:13.078 [-l for compress/decompress workloads, name of uncompressed input file 00:09:13.078 [-S for crc32c workload, use this seed value (default 0) 00:09:13.078 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:09:13.078 [-f for fill workload, use this BYTE value (default 255) 00:09:13.078 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:09:13.078 [-y verify result if this switch is on] 00:09:13.078 [-a tasks to allocate per core (default: same value as -q)] 00:09:13.078 Can be used to spread operations across a wider range of memory. 
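The usage text above is accel_perf rejecting the unsupported foobar workload and listing the options the surrounding tests exercise. For comparison, a well-formed invocation shaped like the crc32c test further down (shown without the -c accel config that the harness feeds through /dev/fd/62):

    # -w picks the workload, -t the runtime in seconds,
    # -S the crc32c seed, -y verifies the results.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w crc32c -S 32 -y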
00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # es=1 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:13.078 00:09:13.078 real 0m0.042s 00:09:13.078 user 0m0.027s 00:09:13.078 sys 0m0.014s 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:13.078 07:14:45 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:09:13.078 ************************************ 00:09:13.078 END TEST accel_wrong_workload 00:09:13.078 ************************************ 00:09:13.078 Error: writing output failed: Broken pipe 00:09:13.078 07:14:45 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:09:13.078 07:14:45 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:09:13.078 07:14:45 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:13.078 07:14:45 accel -- common/autotest_common.sh@10 -- # set +x 00:09:13.078 ************************************ 00:09:13.078 START TEST accel_negative_buffers 00:09:13.078 ************************************ 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # local es=0 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # type -t accel_perf 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:09:13.078 07:14:45 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:09:13.078 -x option must be non-negative. 
00:09:13.078 [2024-07-25 07:14:45.535883] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:09:13.078 accel_perf options: 00:09:13.078 [-h help message] 00:09:13.078 [-q queue depth per core] 00:09:13.078 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:09:13.078 [-T number of threads per core 00:09:13.078 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:09:13.078 [-t time in seconds] 00:09:13.078 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:09:13.078 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:09:13.078 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:09:13.078 [-l for compress/decompress workloads, name of uncompressed input file 00:09:13.078 [-S for crc32c workload, use this seed value (default 0) 00:09:13.078 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:09:13.078 [-f for fill workload, use this BYTE value (default 255) 00:09:13.078 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:09:13.078 [-y verify result if this switch is on] 00:09:13.078 [-a tasks to allocate per core (default: same value as -q)] 00:09:13.078 Can be used to spread operations across a wider range of memory. 00:09:13.078 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # es=1 00:09:13.079 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:13.079 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:13.079 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:13.079 00:09:13.079 real 0m0.043s 00:09:13.079 user 0m0.025s 00:09:13.079 sys 0m0.018s 00:09:13.079 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:13.079 07:14:45 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:09:13.079 ************************************ 00:09:13.079 END TEST accel_negative_buffers 00:09:13.079 ************************************ 00:09:13.079 Error: writing output failed: Broken pipe 00:09:13.079 07:14:45 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:09:13.079 07:14:45 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:09:13.079 07:14:45 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:13.079 07:14:45 accel -- common/autotest_common.sh@10 -- # set +x 00:09:13.079 ************************************ 00:09:13.079 START TEST accel_crc32c 00:09:13.079 ************************************ 00:09:13.079 07:14:45 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -S 32 -y 00:09:13.079 07:14:45 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:09:13.079 07:14:45 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:09:13.079 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.079 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.079 07:14:45 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:09:13.337 07:14:45 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
crc32c -S 32 -y 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:09:13.338 07:14:45 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:09:13.338 [2024-07-25 07:14:45.641278] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:13.338 [2024-07-25 07:14:45.641338] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551058 ] 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:09:13.338 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:13.338 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.338 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:13.338 [2024-07-25 07:14:45.773753] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:13.338 [2024-07-25 07:14:45.855668] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val= 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:13.597 07:14:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:14.534 07:14:47 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:14.534 00:09:14.534 real 0m1.452s 00:09:14.534 user 0m1.276s 00:09:14.534 sys 0m0.182s 00:09:14.534 07:14:47 accel.accel_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:14.534 07:14:47 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:09:14.534 ************************************ 00:09:14.534 END TEST accel_crc32c 00:09:14.534 ************************************ 00:09:14.793 07:14:47 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:09:14.793 07:14:47 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:09:14.793 07:14:47 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:14.793 07:14:47 accel -- common/autotest_common.sh@10 -- # set +x 
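The IFS=: / read -r var val / case "$var" lines that dominate the trace above are accel.sh reading the test's key:value settings one pair at a time and recording which opcode and module the run used, which is what the closing [[ -n software ]] / [[ -n crc32c ]] checks verify. A condensed sketch of that loop (the key names are placeholders; the real logic is the accel/accel.sh@19-27 trace shown here):

    while IFS=: read -r var val; do
        case "$var" in
            # remember what the run reported so it can be checked afterwards
            opc)    accel_opc=$val ;;     # e.g. crc32c
            module) accel_module=$val ;;  # e.g. software
        esac
    done
    [[ -n $accel_module && -n $accel_opc ]]   # both must have been seen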
00:09:14.793 ************************************ 00:09:14.793 START TEST accel_crc32c_C2 00:09:14.793 ************************************ 00:09:14.793 07:14:47 accel.accel_crc32c_C2 -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -y -C 2 00:09:14.793 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:09:14.793 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:09:14.793 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:14.793 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:14.793 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:09:14.794 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:09:14.794 [2024-07-25 07:14:47.175190] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:14.794 [2024-07-25 07:14:47.175243] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551332 ] 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:14.794 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:14.794 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.794 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:14.794 [2024-07-25 07:14:47.305539] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:15.053 [2024-07-25 07:14:47.389229] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 
00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.053 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:15.054 07:14:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.430 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.431 
07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:16.431 00:09:16.431 real 0m1.458s 00:09:16.431 user 0m1.279s 00:09:16.431 sys 0m0.186s 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:16.431 07:14:48 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:09:16.431 ************************************ 00:09:16.431 END TEST accel_crc32c_C2 00:09:16.431 ************************************ 00:09:16.431 07:14:48 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:09:16.431 07:14:48 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:16.431 07:14:48 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:16.431 07:14:48 accel -- common/autotest_common.sh@10 -- # set +x 00:09:16.431 ************************************ 00:09:16.431 START TEST accel_copy 00:09:16.431 ************************************ 00:09:16.431 07:14:48 accel.accel_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy -y 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:16.431 07:14:48 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:09:16.431 [2024-07-25 07:14:48.715944] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
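Every accel_perf invocation in this job carries "-c /dev/fd/62": that descriptor is bash process substitution. build_accel_config (traced at accel/accel.sh@12) assembles the JSON accel configuration, jq -r . (accel/accel.sh@41) normalizes it, and accel_perf reads it from the substituted fd instead of a file on disk. A rough sketch of the shape of that call, illustrative only since the exact JSON produced by build_accel_config is not shown in this log:

    # conceptually: the accel config JSON is generated on the fly and handed
    # to accel_perf on a process-substitution file descriptor
    accel_perf -c <(build_accel_config) -t 1 -w copy -y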
00:09:16.431 [2024-07-25 07:14:48.715998] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551608 ] 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:16.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:16.431 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:16.431 [2024-07-25 07:14:48.844951] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:16.432 [2024-07-25 07:14:48.926946] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.690 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 
07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:16.691 07:14:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 
-- # read -r var val 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:09:17.628 07:14:50 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:17.628 00:09:17.628 real 0m1.452s 00:09:17.628 user 0m1.274s 00:09:17.628 sys 0m0.182s 00:09:17.628 07:14:50 accel.accel_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:17.628 07:14:50 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:09:17.628 ************************************ 00:09:17.628 END TEST accel_copy 00:09:17.628 ************************************ 00:09:17.887 07:14:50 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:17.887 07:14:50 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:09:17.887 07:14:50 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:17.887 07:14:50 accel -- common/autotest_common.sh@10 -- # set +x 00:09:17.887 ************************************ 00:09:17.887 START TEST accel_fill 00:09:17.887 ************************************ 00:09:17.887 07:14:50 accel.accel_fill -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:17.887 07:14:50 
accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:09:17.887 07:14:50 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:09:17.887 [2024-07-25 07:14:50.253347] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:17.887 [2024-07-25 07:14:50.253402] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551876 ] 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested 
device 0000:3f:01.4 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:17.887 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.887 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:17.887 [2024-07-25 07:14:50.382332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:18.146 [2024-07-25 07:14:50.464977] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.146 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 
00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:18.147 07:14:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:18.147 07:14:50 
accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:09:19.524 07:14:51 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:19.524 00:09:19.524 real 0m1.453s 00:09:19.524 user 0m1.271s 00:09:19.524 sys 0m0.187s 00:09:19.524 07:14:51 accel.accel_fill -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:19.524 07:14:51 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:09:19.524 ************************************ 00:09:19.524 END TEST accel_fill 00:09:19.524 ************************************ 00:09:19.524 07:14:51 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:09:19.524 07:14:51 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:19.524 07:14:51 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:19.524 07:14:51 accel -- common/autotest_common.sh@10 -- # set +x 00:09:19.524 ************************************ 00:09:19.524 START TEST accel_copy_crc32c 00:09:19.524 ************************************ 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- 
accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:09:19.524 07:14:51 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:09:19.524 [2024-07-25 07:14:51.797612] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:19.524 [2024-07-25 07:14:51.797736] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1552149 ] 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 
EAL: Requested device 0000:3d:02.7 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.524 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:19.524 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:19.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:19.525 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:19.525 [2024-07-25 07:14:52.004741] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:19.784 [2024-07-25 07:14:52.091997] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c 
-- accel/accel.sh@20 -- # val= 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=1 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:19.784 07:14:52 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:21.161 07:14:53 accel.accel_copy_crc32c 
-- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:21.161 00:09:21.161 real 0m1.554s 00:09:21.161 user 0m1.313s 00:09:21.161 sys 0m0.246s 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:21.161 07:14:53 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:09:21.161 ************************************ 00:09:21.161 END TEST accel_copy_crc32c 00:09:21.161 ************************************ 00:09:21.161 07:14:53 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:09:21.161 07:14:53 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:09:21.161 07:14:53 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:21.161 07:14:53 accel -- common/autotest_common.sh@10 -- # set +x 00:09:21.161 ************************************ 00:09:21.161 START TEST accel_copy_crc32c_C2 00:09:21.161 ************************************ 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:09:21.161 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:09:21.161 [2024-07-25 07:14:53.427681] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
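Every accel sub-test in this job follows the same pattern: accel.sh traces its option loop (the val= assignments above), launches the accel_perf example with the workload named in the test, and, because every QAT virtual-function request fails with "Reached maximum number of QAT devices", ends up on the software module (hence accel_module=software in the trace and the closing [[ -n software ]] check). A rough manual reproduction of the copy_crc32c_C2 case is sketched below, using only the flags visible in the traced command line (accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2); it assumes the same workspace layout and leaves out the JSON accel config that accel.sh pipes in on /dev/fd/62, so pass an equivalent -c <config> if your build of accel_perf requires one:
  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  # flags copied from the traced invocation: run for 1 second, copy_crc32c workload, verify (-y), -C 2
  ./build/examples/accel_perf -t 1 -w copy_crc32c -y -C 2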
00:09:21.161 [2024-07-25 07:14:53.427805] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1552451 ] 00:09:21.161 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested devices 0000:3d:01.0 through 0000:3f:01.6 cannot be used (this pair of messages repeats once per requested QAT virtual function; the remaining functions follow below)
00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:21.162 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:21.162 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:21.162 [2024-07-25 07:14:53.629517] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:21.420 [2024-07-25 07:14:53.712704] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:21.420 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:21.421 07:14:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:22.794 00:09:22.794 real 0m1.538s 00:09:22.794 user 0m1.299s 00:09:22.794 sys 0m0.243s 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:22.794 07:14:54 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:09:22.794 ************************************ 00:09:22.794 END TEST accel_copy_crc32c_C2 00:09:22.794 
************************************ 00:09:22.794 07:14:54 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:09:22.794 07:14:54 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:22.794 07:14:54 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:22.794 07:14:54 accel -- common/autotest_common.sh@10 -- # set +x 00:09:22.794 ************************************ 00:09:22.794 START TEST accel_dualcast 00:09:22.794 ************************************ 00:09:22.794 07:14:54 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dualcast -y 00:09:22.794 07:14:54 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:09:22.794 07:14:54 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:09:22.794 07:14:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:09:22.794 07:14:55 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:09:22.795 [2024-07-25 07:14:55.030569] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
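When skimming this part of the log, the per-test signal is the real/user/sys timing trio printed just before each END TEST banner (real 0m1.453s for accel_fill, 0m1.554s for accel_copy_crc32c, 0m1.538s for accel_copy_crc32c_C2 so far), apparently coming from the shell's time built-in around each run_test. A small sketch for pulling those summaries out of a saved copy of this console output, where console.log is only a placeholder filename:
  # list each END TEST banner together with the elapsed-time lines that precede it
  grep -E 'END TEST accel_|real[[:space:]]+[0-9]+m[0-9.]+s' console.log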
00:09:22.795 [2024-07-25 07:14:55.030626] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1552775 ] 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested devices 0000:3d:01.0 through 0000:3f:01.6 cannot be used (this pair of messages repeats once per requested QAT virtual function; the remaining functions follow below)
00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:22.795 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.795 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:22.795 [2024-07-25 07:14:55.163927] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.795 [2024-07-25 07:14:55.246050] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.795 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:22.796 07:14:55 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:09:24.169 07:14:56 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:24.169 00:09:24.169 real 0m1.465s 00:09:24.169 user 0m1.279s 00:09:24.169 sys 0m0.184s 00:09:24.169 07:14:56 accel.accel_dualcast -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:24.169 07:14:56 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:09:24.169 ************************************ 00:09:24.169 END TEST accel_dualcast 00:09:24.169 ************************************ 00:09:24.169 07:14:56 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:09:24.169 07:14:56 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:24.169 07:14:56 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:24.169 07:14:56 accel -- common/autotest_common.sh@10 -- # set +x 00:09:24.169 ************************************ 00:09:24.169 START TEST accel_compare 00:09:24.169 ************************************ 00:09:24.169 07:14:56 accel.accel_compare -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compare -y 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@12 -- # 
build_accel_config 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:09:24.169 07:14:56 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:09:24.169 [2024-07-25 07:14:56.574976] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:24.169 [2024-07-25 07:14:56.575032] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1553050 ] 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.169 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:24.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.1 cannot be used 
00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:24.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:24.170 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:24.428 [2024-07-25 07:14:56.706904] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:24.428 [2024-07-25 07:14:56.790086] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:24.428 07:14:56 
accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.428 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- 
accel/accel.sh@19 -- # read -r var val 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:24.429 07:14:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:09:25.806 07:14:57 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:25.806 00:09:25.806 real 0m1.454s 00:09:25.806 user 0m1.276s 00:09:25.806 sys 0m0.185s 00:09:25.806 07:14:57 accel.accel_compare -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:25.806 07:14:57 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:09:25.806 ************************************ 00:09:25.806 END TEST accel_compare 00:09:25.806 ************************************ 00:09:25.806 07:14:58 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:09:25.806 07:14:58 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:25.806 07:14:58 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:25.806 07:14:58 accel -- common/autotest_common.sh@10 -- # set +x 00:09:25.806 ************************************ 00:09:25.806 START TEST accel_xor 00:09:25.806 ************************************ 00:09:25.806 07:14:58 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@16 
-- # local accel_opc 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:09:25.806 07:14:58 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:09:25.806 [2024-07-25 07:14:58.082873] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:25.806 [2024-07-25 07:14:58.082927] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1553315 ] 00:09:25.806 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.806 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:25.806 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.806 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:25.806 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:25.807 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:25.807 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:25.807 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:25.807 [2024-07-25 07:14:58.214586] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:25.807 [2024-07-25 07:14:58.297014] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.066 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
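For orientation, the surrounding xtrace entries are accel.sh echoing the settings for this xor pass through its IFS=: / read -r var val / case loop; the val= entries below show the xor opcode, 4096-byte buffers, the software module and a one-second run. A rough standalone equivalent, using only the binary path and flags visible in the traced command above (the harness also passes a generated JSON accel config on -c /dev/fd/62, omitted here because the trace shows that config is empty, accel_json_cfg=() and [[ -n '' ]]):

    # Sketch: drive the same xor workload directly with the flags taken from the traced command.
    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path used by this job
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w xor -y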
00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r 
var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:26.067 07:14:58 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:09:27.002 07:14:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:27.002 00:09:27.002 real 0m1.453s 00:09:27.002 user 0m1.278s 00:09:27.002 sys 0m0.173s 00:09:27.002 07:14:59 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:27.002 07:14:59 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:09:27.002 ************************************ 00:09:27.002 END TEST accel_xor 00:09:27.002 ************************************ 00:09:27.260 07:14:59 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:09:27.260 07:14:59 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:09:27.260 
07:14:59 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:27.260 07:14:59 accel -- common/autotest_common.sh@10 -- # set +x 00:09:27.260 ************************************ 00:09:27.260 START TEST accel_xor 00:09:27.260 ************************************ 00:09:27.260 07:14:59 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y -x 3 00:09:27.260 07:14:59 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:09:27.260 07:14:59 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:09:27.260 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:09:27.261 07:14:59 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:09:27.261 [2024-07-25 07:14:59.628100] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:27.261 [2024-07-25 07:14:59.628160] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1553588 ] 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.2 cannot be 
used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:27.261 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:27.261 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:27.261 [2024-07-25 07:14:59.757050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:27.520 [2024-07-25 07:14:59.844272] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 
00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:27.520 07:14:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:09:28.897 07:15:01 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:28.897 00:09:28.897 real 0m1.464s 00:09:28.897 user 0m1.280s 00:09:28.897 sys 0m0.188s 00:09:28.897 07:15:01 accel.accel_xor -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:09:28.897 07:15:01 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:09:28.897 ************************************ 00:09:28.897 END TEST accel_xor 00:09:28.897 ************************************ 00:09:28.897 07:15:01 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:09:28.897 07:15:01 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:09:28.897 07:15:01 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:28.897 07:15:01 accel -- common/autotest_common.sh@10 -- # set +x 00:09:28.897 ************************************ 00:09:28.897 START TEST accel_dif_verify 00:09:28.897 ************************************ 00:09:28.897 07:15:01 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_verify 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:28.897 07:15:01 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:09:28.897 [2024-07-25 07:15:01.166642] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:09:28.897 [2024-07-25 07:15:01.166700] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1553934 ] 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.897 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:28.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:28.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:28.898 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:28.898 [2024-07-25 07:15:01.299328] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.898 [2024-07-25 07:15:01.384371] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.156 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:29.156 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.156 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.156 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 
07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- 
# case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:29.157 07:15:01 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:09:30.097 07:15:02 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:30.097 00:09:30.097 real 0m1.461s 00:09:30.097 user 0m1.287s 00:09:30.097 sys 0m0.180s 00:09:30.097 07:15:02 accel.accel_dif_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:30.097 07:15:02 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:09:30.097 ************************************ 00:09:30.097 END TEST accel_dif_verify 00:09:30.097 ************************************ 00:09:30.355 07:15:02 accel -- accel/accel.sh@112 -- # 
run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:09:30.355 07:15:02 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:09:30.355 07:15:02 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:30.355 07:15:02 accel -- common/autotest_common.sh@10 -- # set +x 00:09:30.355 ************************************ 00:09:30.355 START TEST accel_dif_generate 00:09:30.355 ************************************ 00:09:30.355 07:15:02 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:09:30.355 07:15:02 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:09:30.355 [2024-07-25 07:15:02.704213] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:09:30.355 [2024-07-25 07:15:02.704268] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1554283 ] 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.6 cannot be used 
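The dif_generate pass starting here follows the dif_verify run that just finished above, which stayed on the software module and whose readout showed what appear to be 4096-byte buffers with a 512-byte block size and 8 bytes of metadata per block. A minimal sketch under the same assumptions as the xor example (workspace path taken from the traced command line; the generated -c /dev/fd/62 config is again omitted):

    # Sketch: run the two DIF workloads directly with the flags visible in the trace.
    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w dif_verify
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w dif_generate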
00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:30.355 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.355 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:30.356 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:30.356 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:30.356 [2024-07-25 07:15:02.837068] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.614 [2024-07-25 07:15:02.922248] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 
07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:30.614 07:15:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:09:31.988 07:15:04 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:31.988 00:09:31.988 real 0m1.462s 00:09:31.988 user 0m1.286s 00:09:31.988 sys 0m0.182s 00:09:31.988 07:15:04 accel.accel_dif_generate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:31.988 07:15:04 accel.accel_dif_generate -- 
common/autotest_common.sh@10 -- # set +x 00:09:31.988 ************************************ 00:09:31.988 END TEST accel_dif_generate 00:09:31.988 ************************************ 00:09:31.988 07:15:04 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:09:31.988 07:15:04 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:09:31.988 07:15:04 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:31.988 07:15:04 accel -- common/autotest_common.sh@10 -- # set +x 00:09:31.988 ************************************ 00:09:31.989 START TEST accel_dif_generate_copy 00:09:31.989 ************************************ 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate_copy 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:09:31.989 [2024-07-25 07:15:04.235331] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
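Note: accel_dif_generate_copy, which starts here, follows the same shape. The only plumbing worth calling out is how the config reaches the binary: build_accel_config collects its JSON fragments (all of the [[ 0 -gt 0 ]] module checks are false on this run, so nothing is added) and the result is evidently handed to accel_perf as -c /dev/fd/62, a file descriptor rather than a file on disk. A small bash illustration of that descriptor trick, with a stand-in payload rather than the config the harness actually generates:

  # Demonstration of the /dev/fd/62 trick visible in the trace: descriptor 62 is
  # populated from a here-string and then read back through /dev/fd/62.
  # accel_test points accel_perf's -c option at this pseudo-file; here plain cat
  # stands in for accel_perf, and the JSON is only a placeholder payload.
  payload='{"placeholder": true}'
  cat /dev/fd/62 62<<< "$payload"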
00:09:31.989 [2024-07-25 07:15:04.235385] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1554689 ] 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:31.989 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:31.989 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:31.989 [2024-07-25 07:15:04.367296] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:31.989 [2024-07-25 07:15:04.451052] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 
00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.989 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:31.990 07:15:04 
accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:31.990 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:32.248 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:32.248 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:32.248 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:32.248 07:15:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:33.183 00:09:33.183 real 0m1.456s 00:09:33.183 user 0m1.274s 00:09:33.183 sys 0m0.185s 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:33.183 07:15:05 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:09:33.183 ************************************ 00:09:33.183 END TEST accel_dif_generate_copy 00:09:33.183 
************************************ 00:09:33.183 07:15:05 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:09:33.183 07:15:05 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:33.183 07:15:05 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:09:33.183 07:15:05 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:33.183 07:15:05 accel -- common/autotest_common.sh@10 -- # set +x 00:09:33.442 ************************************ 00:09:33.442 START TEST accel_comp 00:09:33.442 ************************************ 00:09:33.442 07:15:05 accel.accel_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:33.442 07:15:05 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:09:33.442 [2024-07-25 07:15:05.766935] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
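Note: accel_comp is the first case in this stretch that needs an input buffer: the harness adds -l .../spdk/test/accel/bib, so accel_perf runs the compress workload over the bundled bib file for one second. A reproduction sketch under the same SPDK_DIR assumption as before:

  #!/usr/bin/env bash
  # Sketch of the compress step driven above: one second of the "compress"
  # workload over the bundled test input. The flags and the bib path are taken
  # from the logged invocation; SPDK_DIR is still an assumption.
  set -euo pipefail

  SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}

  "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress \
      -l "$SPDK_DIR/test/accel/bib"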
00:09:33.442 [2024-07-25 07:15:05.766991] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555219 ] 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:33.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.442 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:33.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:33.443 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:33.443 [2024-07-25 07:15:05.899930] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.702 [2024-07-25 07:15:05.982648] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:33.702 
07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.702 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:33.703 07:15:06 accel.accel_comp -- 
accel/accel.sh@19 -- # IFS=: 00:09:33.703 07:15:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:34.674 07:15:07 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:34.674 00:09:34.674 real 0m1.467s 00:09:34.674 user 0m1.291s 00:09:34.674 sys 0m0.182s 00:09:34.674 07:15:07 accel.accel_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:34.674 07:15:07 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:09:34.674 ************************************ 00:09:34.674 END TEST accel_comp 00:09:34.674 ************************************ 00:09:34.933 07:15:07 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:34.933 07:15:07 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:09:34.933 07:15:07 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:34.933 07:15:07 accel -- common/autotest_common.sh@10 -- # set +x 00:09:34.933 ************************************ 00:09:34.933 START TEST accel_decomp 00:09:34.933 ************************************ 00:09:34.933 07:15:07 accel.accel_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # 
IFS=: 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:34.933 07:15:07 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:34.933 [2024-07-25 07:15:07.325205] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:34.933 [2024-07-25 07:15:07.325326] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555504 ] 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:35.192 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:35.192 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:35.192 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:35.192 [2024-07-25 07:15:07.533020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:35.192 [2024-07-25 07:15:07.619210] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:35.192 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.192 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # 
val=1 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:35.193 07:15:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:36.570 07:15:08 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:36.570 07:15:08 
accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:36.570 00:09:36.570 real 0m1.560s 00:09:36.570 user 0m1.300s 00:09:36.570 sys 0m0.264s 00:09:36.570 07:15:08 accel.accel_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:36.570 07:15:08 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:36.570 ************************************ 00:09:36.570 END TEST accel_decomp 00:09:36.570 ************************************ 00:09:36.570 07:15:08 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:36.570 07:15:08 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:36.570 07:15:08 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:36.570 07:15:08 accel -- common/autotest_common.sh@10 -- # set +x 00:09:36.570 ************************************ 00:09:36.570 START TEST accel_decomp_full 00:09:36.570 ************************************ 00:09:36.570 07:15:08 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:36.570 07:15:08 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:36.570 [2024-07-25 07:15:08.948917] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
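The accel_decomp_full case launched above reuses the same accel_perf example binary, now with -o 0; judging from the trace, that switches the expected block size from '4096 bytes' to '111250 bytes', i.e. the whole bib test file is decompressed as a single buffer. A minimal way to rerun roughly the same workload by hand, assuming SPDK is already built under the workspace path shown in the log (the harness additionally pipes a JSON accel config via -c /dev/fd/62, which this sketch omits on the assumption that the default software module is in use):

  # Rough reproduction of the logged accel_perf invocation (sketch only, not part of the test scripts).
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # -t 1 matches the '1 seconds' entry in the expectation trace, -w decompress is the workload,
  # -l points at the compressed bib input, -y and -o 0 are copied verbatim from the logged command.
  $SPDK/build/examples/accel_perf -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0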
00:09:36.570 [2024-07-25 07:15:08.948970] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555815 ] 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:36.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.570 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:36.571 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:36.571 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:36.571 [2024-07-25 07:15:09.069002] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:36.830 [2024-07-25 07:15:09.152450] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 
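The interleaved 'IFS=:', 'read -r var val' and 'case "$var"' lines that dominate this trace come from a small read/case loop in accel/accel.sh (script lines 19 to 23 in the xtrace prefixes) that walks the list of expected settings and records, among others, accel_opc=decompress and accel_module=software. Purely as an illustration of that pattern, with hypothetical key names (opc, module) that are not taken from the real script:

  # Hypothetical sketch of a colon-separated var:val loop like the one traced above.
  expected=$'opc:decompress\nmodule:software'
  while IFS=: read -r var val; do
      case "$var" in
          opc)    accel_opc=$val ;;     # the trace records accel_opc=decompress
          module) accel_module=$val ;;  # the trace records accel_module=software
      esac
  done <<< "$expected"
  echo "$accel_opc via $accel_module"   # -> decompress via software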
00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.830 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 
accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:36.831 07:15:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:38.207 07:15:10 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:38.207 00:09:38.207 real 0m1.473s 00:09:38.207 user 0m1.301s 00:09:38.207 sys 0m0.172s 00:09:38.207 07:15:10 accel.accel_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:38.207 07:15:10 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:38.207 ************************************ 00:09:38.207 END TEST accel_decomp_full 00:09:38.207 ************************************ 00:09:38.207 07:15:10 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:38.207 07:15:10 accel -- 
common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:38.208 07:15:10 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:38.208 07:15:10 accel -- common/autotest_common.sh@10 -- # set +x 00:09:38.208 ************************************ 00:09:38.208 START TEST accel_decomp_mcore 00:09:38.208 ************************************ 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:38.208 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:38.208 [2024-07-25 07:15:10.496899] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
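accel_decomp_mcore runs the identical decompress workload but passes -m 0xf. The mask is a bitmap of CPU cores: 0xf is binary 1111, so cores 0 through 3 are claimed, which is why the app-start notice below reports 'Total cores available: 4' and four separate 'Reactor started on core N' lines appear. A small, purely illustrative helper for decoding such a mask (not something the test scripts contain):

  # Decode an SPDK/DPDK-style core mask into core IDs (illustrative only).
  mask=0xf
  for core in {0..31}; do
      (( (mask >> core) & 1 )) && echo "core $core enabled"
  done
  # Prints cores 0..3, matching the four reactors started in this test.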
00:09:38.208 [2024-07-25 07:15:10.496962] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556085 ] 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:38.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:38.208 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:38.208 [2024-07-25 07:15:10.629655] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:38.208 [2024-07-25 07:15:10.717923] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:38.208 [2024-07-25 07:15:10.718017] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:38.208 [2024-07-25 07:15:10.718311] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:38.208 [2024-07-25 07:15:10.718317] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 
00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:38.467 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var 
val 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:38.468 07:15:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.403 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.662 07:15:11 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:39.662 00:09:39.662 real 0m1.477s 00:09:39.662 user 0m4.691s 00:09:39.662 sys 0m0.183s 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:39.662 07:15:11 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:39.662 ************************************ 00:09:39.662 END TEST accel_decomp_mcore 00:09:39.662 ************************************ 00:09:39.662 07:15:11 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:39.662 07:15:11 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:09:39.662 07:15:11 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:39.662 07:15:11 accel -- common/autotest_common.sh@10 -- # set +x 00:09:39.662 ************************************ 00:09:39.662 START TEST accel_decomp_full_mcore 00:09:39.662 ************************************ 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:39.662 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 
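The accel_decomp_mcore summary above is worth a second look: wall-clock time stayed near 1.5 s (real 0m1.477s) while CPU time grew to user 0m4.691s, about 3.6 times the single-core figure of user 0m1.300s from accel_decomp, consistent with four reactors decompressing in parallel for the whole run. The accel_decomp_full_mcore case starting here simply combines the two earlier variations, full-size buffers (-o 0) and the 0xf core mask (-m 0xf); a hand-run equivalent, under the same assumptions as the earlier sketch (built tree, JSON config via -c omitted):

  # Full-buffer decompress across four cores; flags copied from the logged run_test line.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/examples/accel_perf -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0 -m 0xf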
00:09:39.662 [2024-07-25 07:15:12.025529] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:39.662 [2024-07-25 07:15:12.025581] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556363 ] 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:39.663 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:39.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:39.663 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:39.663 [2024-07-25 07:15:12.156253] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:39.922 [2024-07-25 07:15:12.243263] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:39.922 [2024-07-25 07:15:12.243376] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:39.922 [2024-07-25 07:15:12.243470] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:39.922 [2024-07-25 07:15:12.243474] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.922 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:39.923 07:15:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.301 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:41.302 00:09:41.302 real 0m1.471s 00:09:41.302 user 0m4.717s 00:09:41.302 sys 0m0.184s 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:41.302 07:15:13 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:41.302 ************************************ 00:09:41.302 END TEST accel_decomp_full_mcore 00:09:41.302 ************************************ 00:09:41.302 07:15:13 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:41.302 07:15:13 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:41.302 07:15:13 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:41.302 07:15:13 accel -- common/autotest_common.sh@10 -- # set +x 00:09:41.302 ************************************ 00:09:41.302 START TEST accel_decomp_mthread 00:09:41.302 ************************************ 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 
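accel_decomp_mthread is the last variant started in this stretch of the log; its run_test line swaps the core-mask argument for -T 2, and the expectation list being read out in the trace below ends with val=2, presumably the matching thread-count entry. Reproducing it by hand would again differ only in the trailing flag (same assumptions as the earlier sketches):

  # Threaded decompress variant; -T 2 copied verbatim from the logged command.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/examples/accel_perf -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -T 2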
00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:41.302 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:41.302 [2024-07-25 07:15:13.585819] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:41.302 [2024-07-25 07:15:13.585877] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556631 ] 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 
0000:3f:01.1 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:41.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:41.302 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:41.302 [2024-07-25 07:15:13.718992] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:41.302 [2024-07-25 07:15:13.801895] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:41.562 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:41.563 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:41.563 07:15:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:42.499 00:09:42.499 real 0m1.472s 00:09:42.499 user 0m1.296s 00:09:42.499 sys 0m0.182s 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:42.499 07:15:15 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:42.499 ************************************ 00:09:42.499 END TEST accel_decomp_mthread 00:09:42.499 ************************************ 00:09:42.758 07:15:15 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:42.758 07:15:15 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:09:42.758 07:15:15 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:42.758 07:15:15 accel -- common/autotest_common.sh@10 -- # set +x 00:09:42.758 ************************************ 00:09:42.758 START TEST accel_decomp_full_mthread 00:09:42.758 ************************************ 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:42.758 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
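The accel_decomp_full_mthread case that starts next is the same invocation with one extra flag, -o 0. Judging from the buffer size recorded in its trace ('111250 bytes' where the test that just finished used '4096 bytes'), -o 0 appears to make accel_perf submit the whole bib test vector as a single operation instead of 4096-byte chunks; that reading is an assumption on my part, not something the log states. Side by side (the -c /dev/fd/62 argument is the harness's process-substituted JSON accel config):

  # chunked decompress, 2 threads per core (the test that just completed)
  $ ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l test/accel/bib -y -T 2
  # full-buffer variant (the test starting below); only -o 0 is added
  $ ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l test/accel/bib -y -o 0 -T 2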
00:09:42.758 [2024-07-25 07:15:15.133339] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:42.759 [2024-07-25 07:15:15.133395] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556909 ] 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:42.759 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:42.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:42.759 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:42.759 [2024-07-25 07:15:15.264927] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:43.018 [2024-07-25 07:15:15.348088] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.018 07:15:15 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:43.018 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 
00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:43.019 07:15:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 
07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:44.395 00:09:44.395 real 0m1.502s 00:09:44.395 user 0m1.319s 00:09:44.395 sys 0m0.188s 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:44.395 07:15:16 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:44.395 ************************************ 00:09:44.395 END TEST accel_decomp_full_mthread 00:09:44.395 ************************************ 00:09:44.395 07:15:16 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:09:44.395 07:15:16 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:09:44.395 07:15:16 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:09:44.395 07:15:16 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:44.395 07:15:16 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1557171 00:09:44.395 07:15:16 accel -- accel/accel.sh@63 -- # waitforlisten 1557171 00:09:44.395 07:15:16 accel -- common/autotest_common.sh@831 -- # '[' -z 1557171 ']' 00:09:44.395 07:15:16 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:44.395 07:15:16 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:44.395 07:15:16 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:44.395 07:15:16 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:44.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:44.396 07:15:16 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:44.396 07:15:16 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:44.396 07:15:16 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:44.396 07:15:16 accel -- common/autotest_common.sh@10 -- # set +x 00:09:44.396 07:15:16 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:44.396 07:15:16 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:44.396 07:15:16 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:44.396 07:15:16 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:44.396 07:15:16 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:44.396 07:15:16 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:44.396 07:15:16 accel -- accel/accel.sh@41 -- # jq -r . 00:09:44.396 [2024-07-25 07:15:16.705374] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
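Because COMPRESSDEV=1, the build_accel_config step above appends a single entry, '{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}', to the accel config that spdk_tgt reads from /dev/fd/63. Expanded into the JSON document the target receives, it should look roughly like the sketch below; the subsystems/config wrapper is inferred from the jq filter used a little further down ('.subsystems[] | select(.subsystem=="accel").config[]'), so treat the exact shape as an assumption. A pmd value of 0 appears to let the compressdev module pick a PMD itself; the one actually selected is reported later in this log as compress_qat.

  {
    "subsystems": [
      {
        "subsystem": "accel",
        "config": [
          {"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}
        ]
      }
    ]
  }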
00:09:44.396 [2024-07-25 07:15:16.705438] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557171 ] 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:44.396 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:44.396 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:44.396 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:44.396 [2024-07-25 07:15:16.837583] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:44.396 [2024-07-25 07:15:16.923042] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:45.331 [2024-07-25 07:15:17.619733] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:45.331 07:15:17 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:45.331 07:15:17 accel -- common/autotest_common.sh@864 -- # return 0 00:09:45.331 07:15:17 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:45.331 07:15:17 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:45.331 07:15:17 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:45.331 07:15:17 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:09:45.331 07:15:17 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:09:45.331 07:15:17 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:09:45.331 07:15:17 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:45.331 07:15:17 accel -- common/autotest_common.sh@10 -- # set +x 00:09:45.331 07:15:17 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:09:45.331 07:15:17 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:09:45.590 07:15:17 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:45.590 "method": "compressdev_scan_accel_module", 00:09:45.590 07:15:17 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:45.590 07:15:17 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:45.590 07:15:17 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:45.590 07:15:17 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:09:45.590 07:15:17 accel -- common/autotest_common.sh@10 -- # set +x 00:09:45.590 07:15:17 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:45.590 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.590 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.590 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.590 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.590 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.590 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.590 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.590 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.590 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.590 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:45.591 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:45.591 07:15:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:17 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:18 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:18 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:18 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:18 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:18 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:09:45.591 07:15:18 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:18 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:18 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:18 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:18 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:18 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:18 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # IFS== 00:09:45.591 07:15:18 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:45.591 07:15:18 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:45.591 07:15:18 accel -- accel/accel.sh@75 -- # killprocess 1557171 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@950 -- # '[' -z 1557171 ']' 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@954 -- # kill -0 1557171 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@955 -- # uname 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1557171 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1557171' 00:09:45.591 killing process with pid 1557171 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@969 -- # kill 1557171 00:09:45.591 07:15:18 accel -- common/autotest_common.sh@974 -- # wait 1557171 00:09:46.158 07:15:18 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:46.158 07:15:18 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:46.158 07:15:18 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:09:46.158 07:15:18 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:46.158 07:15:18 accel -- common/autotest_common.sh@10 -- # set +x 00:09:46.158 ************************************ 00:09:46.158 START TEST accel_cdev_comp 00:09:46.159 ************************************ 00:09:46.159 07:15:18 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:46.159 07:15:18 
accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:46.159 07:15:18 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:09:46.159 [2024-07-25 07:15:18.477689] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:46.159 [2024-07-25 07:15:18.477747] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557578 ] 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 
0000:3d:02.6 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:46.159 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:46.159 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:46.159 [2024-07-25 07:15:18.608156] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:46.159 [2024-07-25 07:15:18.690714] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:47.095 [2024-07-25 07:15:19.378654] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:47.095 [2024-07-25 07:15:19.381047] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fdd000 PMD being used: compress_qat 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 [2024-07-25 07:15:19.384843] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fe1e00 PMD being used: compress_qat 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 
accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 
-- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:47.095 07:15:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:48.066 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:48.067 07:15:20 accel.accel_cdev_comp -- 
accel/accel.sh@19 -- # IFS=: 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:48.067 07:15:20 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:48.067 00:09:48.067 real 0m2.098s 00:09:48.067 user 0m1.578s 00:09:48.067 sys 0m0.520s 00:09:48.067 07:15:20 accel.accel_cdev_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:48.067 07:15:20 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:09:48.067 ************************************ 00:09:48.067 END TEST accel_cdev_comp 00:09:48.067 ************************************ 00:09:48.067 07:15:20 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:48.067 07:15:20 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:09:48.067 07:15:20 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:48.067 07:15:20 accel -- common/autotest_common.sh@10 -- # set +x 00:09:48.325 ************************************ 00:09:48.325 START TEST accel_cdev_decomp 00:09:48.325 ************************************ 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:48.325 07:15:20 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 
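For reference, the accel_cdev_decomp run starting here boils down to a single accel_perf invocation against the bib test file with the DPDK compressdev accel module enabled, exactly as traced above (accel.sh feeds the generated JSON config through /dev/fd/62). The sketch below is a minimal stand-alone approximation: the binary path, the -t/-w/-l/-y arguments, and the inner compressdev_scan_accel_module object are taken verbatim from this trace, while the SPDK_DIR and ACCEL_CFG variable names and the outer "subsystems"/"accel" wrapper are assumptions about what build_accel_config ends up emitting.

  # Minimal sketch (see assumptions above): one-second decompress run on the
  # bib file with the dpdk_compressdev module scanned in.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path as seen in the trace
  # Inner method/params object is verbatim from the trace; the surrounding
  # "subsystems"/"accel" wrapper is an assumed reconstruction.
  ACCEL_CFG='{"subsystems":[{"subsystem":"accel","config":[{"method":"compressdev_scan_accel_module","params":{"pmd":0}}]}]}'

  "$SPDK_DIR/build/examples/accel_perf" \
      -c <(echo "$ACCEL_CFG") \
      -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y

The pass/fail check each of these tests applies at the end (accel.sh@27) only verifies that an opcode was captured and that the module in use is dpdk_compressdev.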
00:09:48.325 [2024-07-25 07:15:20.654174] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:48.325 [2024-07-25 07:15:20.654231] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557923 ] 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:48.325 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.325 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:48.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.326 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:48.326 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:48.326 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:48.326 [2024-07-25 07:15:20.788313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:48.583 [2024-07-25 07:15:20.871417] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:49.149 [2024-07-25 07:15:21.553916] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:49.149 [2024-07-25 07:15:21.556293] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13e1000 PMD being used: compress_qat 00:09:49.149 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.149 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.149 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.150 [2024-07-25 07:15:21.560220] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13e5e00 PMD being used: compress_qat 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case 
"$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:49.150 07:15:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:50.524 00:09:50.524 real 0m2.093s 00:09:50.524 user 0m1.552s 00:09:50.524 sys 
0m0.544s 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:50.524 07:15:22 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:50.524 ************************************ 00:09:50.524 END TEST accel_cdev_decomp 00:09:50.524 ************************************ 00:09:50.524 07:15:22 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:50.524 07:15:22 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:50.524 07:15:22 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:50.524 07:15:22 accel -- common/autotest_common.sh@10 -- # set +x 00:09:50.524 ************************************ 00:09:50.524 START TEST accel_cdev_decomp_full 00:09:50.524 ************************************ 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:50.524 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:50.525 07:15:22 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:50.525 [2024-07-25 07:15:22.826055] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
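The accel_cdev_decomp_full variant being set up above differs from the previous run only in the extra -o 0 flag; in the value trace that follows, the buffer size correspondingly changes from the 4096-byte default to the full 111250-byte bib file. A minimal sketch, reusing the hypothetical SPDK_DIR and ACCEL_CFG variables from the earlier sketch:

  # Same decompress run, with -o 0 so the whole input file is used as a
  # single buffer (the trace below shows '111250 bytes' instead of '4096 bytes').
  "$SPDK_DIR/build/examples/accel_perf" \
      -c <(echo "$ACCEL_CFG") \
      -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y -o 0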
00:09:50.525 [2024-07-25 07:15:22.826108] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1558284 ] 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:50.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:50.525 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:50.525 [2024-07-25 07:15:22.956713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:50.525 [2024-07-25 07:15:23.039082] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:51.458 [2024-07-25 07:15:23.730874] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:51.458 [2024-07-25 07:15:23.733338] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe91000 PMD being used: compress_qat 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.458 [2024-07-25 07:15:23.736293] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe943a0 PMD being used: compress_qat 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.458 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:51.459 07:15:23 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:52.395 07:15:24 
accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:52.395 00:09:52.395 real 0m2.100s 00:09:52.395 user 0m1.573s 00:09:52.395 sys 0m0.522s 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:52.395 07:15:24 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:52.395 ************************************ 00:09:52.395 END TEST accel_cdev_decomp_full 00:09:52.395 ************************************ 00:09:52.654 07:15:24 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:52.654 07:15:24 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:52.654 07:15:24 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:52.654 07:15:24 accel -- common/autotest_common.sh@10 -- # set +x 00:09:52.654 ************************************ 00:09:52.654 START TEST accel_cdev_decomp_mcore 00:09:52.654 ************************************ 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:52.654 07:15:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:52.654 [2024-07-25 07:15:25.003998] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
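accel_cdev_decomp_mcore adds -m 0xf to the same invocation, so accel_perf runs on a four-core mask; the startup output below reports 'Total cores available: 4', a reactor started on each of cores 0 through 3, and several compress_qat channels being set up across them. A minimal sketch under the same assumptions as before:

  # Multi-core variant: decompress workload across cores 0-3 (-m 0xf),
  # reusing the hypothetical SPDK_DIR and ACCEL_CFG from the first sketch.
  "$SPDK_DIR/build/examples/accel_perf" \
      -c <(echo "$ACCEL_CFG") \
      -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y -m 0xf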
00:09:52.654 [2024-07-25 07:15:25.004051] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1558745 ] 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.654 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:52.654 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.655 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:52.655 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.655 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:52.655 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.655 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:52.655 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.655 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:52.655 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.655 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:52.655 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.655 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:52.655 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.655 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:52.655 [2024-07-25 07:15:25.135349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:52.913 [2024-07-25 07:15:25.222997] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:52.913 [2024-07-25 07:15:25.223092] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:52.913 [2024-07-25 07:15:25.223216] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:52.913 [2024-07-25 07:15:25.223221] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:53.481 [2024-07-25 07:15:25.903736] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:53.481 [2024-07-25 07:15:25.906123] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ca96b0 PMD being used: compress_qat 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 [2024-07-25 07:15:25.911306] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f59ac19b8b0 PMD being used: compress_qat 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:53.481 [2024-07-25 07:15:25.911851] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f59a419b8b0 PMD being used: compress_qat 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore 
-- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 [2024-07-25 07:15:25.912770] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1caea10 PMD being used: compress_qat 00:09:53.481 [2024-07-25 07:15:25.913044] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f599c19b8b0 PMD being used: compress_qat 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:53.481 07:15:25 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # 
IFS=: 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.931 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:54.932 00:09:54.932 real 0m2.109s 00:09:54.932 user 0m6.878s 00:09:54.932 sys 0m0.532s 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:54.932 07:15:27 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:54.932 ************************************ 00:09:54.932 END TEST accel_cdev_decomp_mcore 00:09:54.932 ************************************ 00:09:54.932 07:15:27 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:54.932 07:15:27 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:09:54.932 07:15:27 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:54.932 07:15:27 accel -- common/autotest_common.sh@10 -- # set +x 00:09:54.932 ************************************ 00:09:54.932 START TEST accel_cdev_decomp_full_mcore 00:09:54.932 ************************************ 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 
00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:54.932 07:15:27 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:54.932 [2024-07-25 07:15:27.191044] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
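accel_cdev_decomp_full_mcore, the last test visible in this part of the log, simply combines the two previous options: full-size buffers (-o 0) and the four-core mask (-m 0xf). Sketched under the same assumptions:

  # Combined variant: full-buffer decompress across cores 0-3.
  "$SPDK_DIR/build/examples/accel_perf" \
      -c <(echo "$ACCEL_CFG") \
      -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y -o 0 -m 0xf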
00:09:54.932 [2024-07-25 07:15:27.191097] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559042 ] 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:54.932 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:54.932 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:54.932 [2024-07-25 07:15:27.323342] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:54.932 [2024-07-25 07:15:27.410435] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:54.932 [2024-07-25 07:15:27.410530] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:54.932 [2024-07-25 07:15:27.410634] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:54.932 [2024-07-25 07:15:27.410637] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.867 [2024-07-25 07:15:28.089084] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:55.867 [2024-07-25 07:15:28.091452] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15c36b0 PMD being used: compress_qat 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.867 [2024-07-25 07:15:28.095722] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f837c19b8b0 PMD being used: compress_qat 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:55.867 [2024-07-25 07:15:28.096251] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f837419b8b0 PMD being used: 
compress_qat 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.867 [2024-07-25 07:15:28.097144] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15c3750 PMD being used: compress_qat 00:09:55.867 [2024-07-25 07:15:28.097488] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f836c19b8b0 PMD being used: compress_qat 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:55.867 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:55.868 07:15:28 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:09:56.804 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:56.805 00:09:56.805 real 0m2.108s 00:09:56.805 user 0m6.880s 00:09:56.805 sys 0m0.522s 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:56.805 07:15:29 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:56.805 ************************************ 00:09:56.805 END TEST accel_cdev_decomp_full_mcore 00:09:56.805 ************************************ 00:09:56.805 07:15:29 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:56.805 07:15:29 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:56.805 07:15:29 accel -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:09:56.805 07:15:29 accel -- common/autotest_common.sh@10 -- # set +x 00:09:57.064 ************************************ 00:09:57.064 START TEST accel_cdev_decomp_mthread 00:09:57.064 ************************************ 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:57.064 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:57.065 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:57.065 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:57.065 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:57.065 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:57.065 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:57.065 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:57.065 07:15:29 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:57.065 [2024-07-25 07:15:29.382115] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
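The accel_cdev_decomp_mthread case starting here differs from the multi-core run only in its overrides: the core mask resolves to 0x1 in the EAL parameters that follow, the op size read back is the default 4096 bytes, and -T 2 is read back as val=2, which this log suggests is a per-core thread/channel count (an inference, not accel_perf's own wording). A sketch of the two threaded invocations under those assumptions, with accel_cfg.json again a placeholder for the /dev/fd/62 config:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# single core, two threads, default 4096-byte ops
"$SPDK/build/examples/accel_perf" -c accel_cfg.json -t 1 -w decompress \
    -l "$SPDK/test/accel/bib" -y -T 2
# accel_cdev_decomp_full_mthread further below combines both overrides:
# whole-file ops (-o 0) plus two threads (-T 2)
"$SPDK/build/examples/accel_perf" -c accel_cfg.json -t 1 -w decompress \
    -l "$SPDK/test/accel/bib" -y -o 0 -T 2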
00:09:57.065 [2024-07-25 07:15:29.382179] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559522 ] 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:57.065 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:57.065 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:57.065 [2024-07-25 07:15:29.512013] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:57.065 [2024-07-25 07:15:29.594843] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:58.001 [2024-07-25 07:15:30.277570] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:58.001 [2024-07-25 07:15:30.279967] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dcf000 PMD being used: compress_qat 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.001 [2024-07-25 07:15:30.284550] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dd4200 PMD being used: compress_qat 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.001 [2024-07-25 07:15:30.286878] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ef6ff0 PMD being used: 
compress_qat 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.001 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread 
-- accel/accel.sh@20 -- # val=2 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.002 07:15:30 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:58.937 00:09:58.937 real 0m2.096s 00:09:58.937 user 0m1.557s 00:09:58.937 sys 0m0.545s 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:58.937 07:15:31 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:58.937 ************************************ 00:09:58.937 END TEST accel_cdev_decomp_mthread 00:09:58.937 ************************************ 00:09:59.196 07:15:31 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:59.196 07:15:31 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:09:59.196 07:15:31 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:59.196 07:15:31 accel -- common/autotest_common.sh@10 -- # set +x 00:09:59.196 ************************************ 00:09:59.196 START TEST accel_cdev_decomp_full_mthread 00:09:59.196 ************************************ 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:59.196 07:15:31 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:59.196 07:15:31 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:59.196 [2024-07-25 07:15:31.557743] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:09:59.196 [2024-07-25 07:15:31.557796] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559871 ] 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:59.196 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.196 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested 
device 0000:3f:01.1 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:59.197 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:59.197 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:59.197 [2024-07-25 07:15:31.687243] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:59.455 [2024-07-25 07:15:31.769852] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:00.023 [2024-07-25 07:15:32.460576] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:00.023 [2024-07-25 07:15:32.462934] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e22000 PMD being used: compress_qat 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 [2024-07-25 07:15:32.466731] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e253a0 PMD being used: compress_qat 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 [2024-07-25 07:15:32.469282] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f49c00 PMD being used: compress_qat 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case 
"$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:00.023 07:15:32 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:10:01.401 00:10:01.401 real 0m2.105s 00:10:01.401 user 0m1.576s 00:10:01.401 sys 0m0.526s 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:01.401 07:15:33 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:10:01.401 ************************************ 00:10:01.401 END TEST accel_cdev_decomp_full_mthread 00:10:01.401 ************************************ 00:10:01.401 07:15:33 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:10:01.401 07:15:33 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:10:01.401 07:15:33 accel -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:10:01.401 07:15:33 accel -- accel/accel.sh@137 -- # build_accel_config 00:10:01.401 07:15:33 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:01.401 07:15:33 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:01.401 07:15:33 accel -- common/autotest_common.sh@10 -- # set +x 00:10:01.401 07:15:33 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:01.401 
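From here the run switches from the compressdev cases to the accel_dif_functional_tests CUnit suite, driven by the standalone dif binary named in the run_test line above. The *ERROR* lines printed by the passing "not generated" and "incorrect" cases below are expected: each case corrupts one field of the 8-byte protection-information tuple — the 16-bit guard CRC, the 16-bit application tag, or the 32-bit reference tag — and checks that verification reports the mismatch (those field sizes are standard T10 PI layout, not something this log states). Rerunning just this suite would look roughly like the sketch below, with accel_cfg.json standing in for the JSON normally supplied on /dev/fd/62.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Runs the accel_dif CUnit suite on its own; expect the same Guard/App Tag/Ref Tag
# comparison errors in the negative cases, followed by "passed".
"$SPDK/test/accel/dif/dif" -c accel_cfg.json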
07:15:33 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:01.401 07:15:33 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:01.401 07:15:33 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:01.401 07:15:33 accel -- accel/accel.sh@40 -- # local IFS=, 00:10:01.401 07:15:33 accel -- accel/accel.sh@41 -- # jq -r . 00:10:01.401 ************************************ 00:10:01.401 START TEST accel_dif_functional_tests 00:10:01.401 ************************************ 00:10:01.401 07:15:33 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:10:01.401 [2024-07-25 07:15:33.767361] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:10:01.402 [2024-07-25 07:15:33.767415] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560178 ] 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: 
Requested device 0000:3f:01.1 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:01.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:01.402 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:01.402 [2024-07-25 07:15:33.899981] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:10:01.661 [2024-07-25 07:15:33.985621] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:10:01.661 [2024-07-25 07:15:33.985715] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:10:01.661 [2024-07-25 07:15:33.985719] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:01.661 00:10:01.661 00:10:01.661 CUnit - A unit testing framework for C - Version 2.1-3 00:10:01.661 http://cunit.sourceforge.net/ 00:10:01.661 00:10:01.661 00:10:01.661 Suite: accel_dif 00:10:01.661 Test: verify: DIF generated, GUARD check ...passed 00:10:01.661 Test: verify: DIF generated, APPTAG check ...passed 00:10:01.661 Test: verify: DIF generated, REFTAG check ...passed 00:10:01.661 Test: verify: DIF not generated, GUARD check ...[2024-07-25 07:15:34.072118] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:10:01.661 passed 00:10:01.661 Test: verify: DIF not generated, APPTAG check ...[2024-07-25 07:15:34.072199] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:10:01.661 passed 00:10:01.661 Test: verify: DIF not generated, REFTAG check ...[2024-07-25 07:15:34.072231] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:10:01.661 passed 00:10:01.661 Test: verify: APPTAG correct, APPTAG check ...passed 00:10:01.661 Test: verify: APPTAG 
incorrect, APPTAG check ...[2024-07-25 07:15:34.072295] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:10:01.661 passed 00:10:01.661 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:10:01.661 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:10:01.661 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:10:01.661 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-25 07:15:34.072437] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:10:01.661 passed 00:10:01.661 Test: verify copy: DIF generated, GUARD check ...passed 00:10:01.661 Test: verify copy: DIF generated, APPTAG check ...passed 00:10:01.661 Test: verify copy: DIF generated, REFTAG check ...passed 00:10:01.661 Test: verify copy: DIF not generated, GUARD check ...[2024-07-25 07:15:34.072593] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:10:01.661 passed 00:10:01.661 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-25 07:15:34.072628] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:10:01.661 passed 00:10:01.661 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-25 07:15:34.072658] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:10:01.661 passed 00:10:01.661 Test: generate copy: DIF generated, GUARD check ...passed 00:10:01.661 Test: generate copy: DIF generated, APTTAG check ...passed 00:10:01.661 Test: generate copy: DIF generated, REFTAG check ...passed 00:10:01.661 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:10:01.661 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:10:01.661 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:10:01.661 Test: generate copy: iovecs-len validate ...[2024-07-25 07:15:34.072890] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:10:01.661 passed 00:10:01.661 Test: generate copy: buffer alignment validate ...passed 00:10:01.661 00:10:01.661 Run Summary: Type Total Ran Passed Failed Inactive 00:10:01.661 suites 1 1 n/a 0 0 00:10:01.661 tests 26 26 26 0 0 00:10:01.661 asserts 115 115 115 0 n/a 00:10:01.661 00:10:01.661 Elapsed time = 0.002 seconds 00:10:01.920 00:10:01.920 real 0m0.549s 00:10:01.920 user 0m0.719s 00:10:01.920 sys 0m0.221s 00:10:01.920 07:15:34 accel.accel_dif_functional_tests -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:01.920 07:15:34 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:10:01.920 ************************************ 00:10:01.920 END TEST accel_dif_functional_tests 00:10:01.920 ************************************ 00:10:01.920 00:10:01.920 real 0m51.641s 00:10:01.920 user 0m59.699s 00:10:01.920 sys 0m11.391s 00:10:01.920 07:15:34 accel -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:01.920 07:15:34 accel -- common/autotest_common.sh@10 -- # set +x 00:10:01.920 ************************************ 00:10:01.920 END TEST accel 00:10:01.920 ************************************ 00:10:01.920 07:15:34 -- spdk/autotest.sh@186 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:10:01.920 07:15:34 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:01.920 07:15:34 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:01.920 07:15:34 -- common/autotest_common.sh@10 -- # set +x 00:10:01.920 ************************************ 00:10:01.920 START TEST accel_rpc 00:10:01.920 ************************************ 00:10:01.920 07:15:34 accel_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:10:02.179 * Looking for test storage... 00:10:02.179 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:10:02.179 07:15:34 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:10:02.179 07:15:34 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1560477 00:10:02.179 07:15:34 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1560477 00:10:02.179 07:15:34 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:10:02.179 07:15:34 accel_rpc -- common/autotest_common.sh@831 -- # '[' -z 1560477 ']' 00:10:02.179 07:15:34 accel_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:02.179 07:15:34 accel_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:02.179 07:15:34 accel_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:02.179 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:02.179 07:15:34 accel_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:02.179 07:15:34 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:02.179 [2024-07-25 07:15:34.547732] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
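With the accel suite finished, accel_rpc.sh begins and follows the target lifecycle visible in the trace: start spdk_tgt with --wait-for-rpc, record its PID, install an ERR trap that kills it, wait for the RPC socket with waitforlisten, and only then issue rpc_cmd calls. A condensed sketch of that lifecycle, with the helper bodies simplified relative to the autotest_common.sh versions traced here:

    #!/usr/bin/env bash
    # Simplified stand-ins for the killprocess/waitforlisten helpers traced above.
    killprocess() { kill "$1" 2>/dev/null; wait "$1" 2>/dev/null || true; }

    waitforlisten() {
        local pid=$1
        # Poll until the target answers an RPC; the real helper is more thorough.
        until ./scripts/rpc.py spdk_get_version &>/dev/null; do
            kill -0 "$pid" || return 1   # give up if the target exited
            sleep 0.5
        done
    }

    ./build/bin/spdk_tgt --wait-for-rpc &
    spdk_tgt_pid=$!
    trap 'killprocess $spdk_tgt_pid; exit 1' ERR
    waitforlisten "$spdk_tgt_pid"

    ./scripts/rpc.py accel_assign_opc -o copy -m software   # config-time RPC, exercised further down in the trace
    ./scripts/rpc.py framework_start_init                   # leave --wait-for-rpc mode
    killprocess "$spdk_tgt_pid"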
00:10:02.179 [2024-07-25 07:15:34.547799] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560477 ] 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:02.179 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:02.179 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.179 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:02.180 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:02.180 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:02.180 [2024-07-25 07:15:34.681185] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:02.438 [2024-07-25 07:15:34.765545] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:03.004 07:15:35 accel_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:03.004 07:15:35 accel_rpc -- common/autotest_common.sh@864 -- # return 0 00:10:03.004 07:15:35 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:10:03.004 07:15:35 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:10:03.004 07:15:35 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:10:03.004 07:15:35 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:10:03.004 07:15:35 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:10:03.004 07:15:35 accel_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:03.004 07:15:35 accel_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:03.004 07:15:35 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:03.004 ************************************ 00:10:03.004 START TEST accel_assign_opcode 00:10:03.004 ************************************ 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # accel_assign_opcode_test_suite 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:03.004 [2024-07-25 07:15:35.415593] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:03.004 [2024-07-25 07:15:35.423608] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation copy will be assigned to module software 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.004 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.263 software 00:10:03.263 00:10:03.263 real 0m0.288s 00:10:03.263 user 0m0.048s 00:10:03.263 sys 0m0.015s 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:03.263 07:15:35 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:03.263 ************************************ 00:10:03.263 END TEST accel_assign_opcode 00:10:03.263 ************************************ 00:10:03.263 07:15:35 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1560477 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@950 -- # '[' -z 1560477 ']' 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@954 -- # kill -0 1560477 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@955 -- # uname 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1560477 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1560477' 00:10:03.263 killing process with pid 1560477 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@969 -- # kill 1560477 00:10:03.263 07:15:35 accel_rpc -- common/autotest_common.sh@974 -- # wait 1560477 00:10:03.830 00:10:03.830 real 0m1.730s 00:10:03.830 user 0m1.709s 00:10:03.830 sys 0m0.561s 00:10:03.830 07:15:36 accel_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:03.830 07:15:36 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:03.830 ************************************ 00:10:03.830 END TEST accel_rpc 00:10:03.830 ************************************ 00:10:03.830 07:15:36 -- spdk/autotest.sh@189 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:10:03.830 07:15:36 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:03.830 07:15:36 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:03.830 07:15:36 -- common/autotest_common.sh@10 -- # set +x 00:10:03.830 ************************************ 00:10:03.830 START TEST app_cmdline 00:10:03.830 
************************************ 00:10:03.830 07:15:36 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:10:03.830 * Looking for test storage... 00:10:03.830 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:10:03.830 07:15:36 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:10:03.830 07:15:36 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1560813 00:10:03.830 07:15:36 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1560813 00:10:03.830 07:15:36 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:10:03.830 07:15:36 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 1560813 ']' 00:10:03.830 07:15:36 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:03.830 07:15:36 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:03.830 07:15:36 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:03.830 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:03.830 07:15:36 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:03.830 07:15:36 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:10:03.830 [2024-07-25 07:15:36.360696] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:10:03.830 [2024-07-25 07:15:36.360758] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560813 ] 00:10:04.088 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.088 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:04.088 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.088 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:04.088 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 
0000:3d:02.3 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:04.089 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.089 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:04.089 [2024-07-25 07:15:36.494320] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:04.089 [2024-07-25 07:15:36.577643] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:05.022 07:15:37 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:05.022 07:15:37 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:10:05.022 07:15:37 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:10:05.022 { 00:10:05.022 "version": "SPDK v24.09-pre git sha1 e5ef9abc9", 00:10:05.022 "fields": { 00:10:05.022 "major": 24, 00:10:05.022 "minor": 9, 00:10:05.022 "patch": 0, 00:10:05.022 
"suffix": "-pre", 00:10:05.022 "commit": "e5ef9abc9" 00:10:05.022 } 00:10:05.022 } 00:10:05.022 07:15:37 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:10:05.022 07:15:37 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:10:05.022 07:15:37 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:10:05.022 07:15:37 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:10:05.022 07:15:37 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:10:05.022 07:15:37 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:05.023 07:15:37 app_cmdline -- app/cmdline.sh@26 -- # sort 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:05.023 07:15:37 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:10:05.023 07:15:37 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:10:05.023 07:15:37 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:05.023 07:15:37 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:10:05.281 request: 00:10:05.281 { 00:10:05.281 "method": "env_dpdk_get_mem_stats", 00:10:05.281 "req_id": 1 00:10:05.281 } 00:10:05.281 Got JSON-RPC error response 00:10:05.281 response: 00:10:05.281 { 00:10:05.281 "code": -32601, 00:10:05.281 "message": "Method not found" 00:10:05.281 } 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:05.281 07:15:37 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1560813 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 
1560813 ']' 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 1560813 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1560813 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1560813' 00:10:05.281 killing process with pid 1560813 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@969 -- # kill 1560813 00:10:05.281 07:15:37 app_cmdline -- common/autotest_common.sh@974 -- # wait 1560813 00:10:05.848 00:10:05.848 real 0m1.950s 00:10:05.848 user 0m2.316s 00:10:05.848 sys 0m0.605s 00:10:05.848 07:15:38 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:05.848 07:15:38 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:10:05.848 ************************************ 00:10:05.848 END TEST app_cmdline 00:10:05.848 ************************************ 00:10:05.848 07:15:38 -- spdk/autotest.sh@190 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:10:05.848 07:15:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:05.848 07:15:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:05.848 07:15:38 -- common/autotest_common.sh@10 -- # set +x 00:10:05.848 ************************************ 00:10:05.848 START TEST version 00:10:05.848 ************************************ 00:10:05.848 07:15:38 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:10:05.848 * Looking for test storage... 
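The app_cmdline test that just completed is an RPC whitelist check: spdk_tgt is started with --rpcs-allowed spdk_get_version,rpc_get_methods, the two allowed methods must answer, and any other method, env_dpdk_get_mem_stats in this run, must fail with JSON-RPC error -32601 (Method not found). Reproduced by hand it looks roughly like this, with paths relative to an SPDK checkout and the startup wait elided:

    # Only two RPC methods are whitelisted on this target instance.
    ./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
    # (the real test gates the next calls on waitforlisten)

    ./scripts/rpc.py spdk_get_version                      # allowed, prints the version JSON
    ./scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort  # allowed, lists exactly the two methods

    # Everything else must be rejected with "Method not found" (code -32601).
    if ./scripts/rpc.py env_dpdk_get_mem_stats; then
        echo "unexpected success for non-whitelisted RPC" >&2
        exit 1
    fi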
00:10:05.848 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:10:05.848 07:15:38 version -- app/version.sh@17 -- # get_header_version major 00:10:05.848 07:15:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:05.848 07:15:38 version -- app/version.sh@14 -- # cut -f2 00:10:05.848 07:15:38 version -- app/version.sh@14 -- # tr -d '"' 00:10:05.848 07:15:38 version -- app/version.sh@17 -- # major=24 00:10:05.848 07:15:38 version -- app/version.sh@18 -- # get_header_version minor 00:10:05.848 07:15:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:05.848 07:15:38 version -- app/version.sh@14 -- # cut -f2 00:10:05.848 07:15:38 version -- app/version.sh@14 -- # tr -d '"' 00:10:05.848 07:15:38 version -- app/version.sh@18 -- # minor=9 00:10:05.848 07:15:38 version -- app/version.sh@19 -- # get_header_version patch 00:10:05.848 07:15:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:05.848 07:15:38 version -- app/version.sh@14 -- # cut -f2 00:10:05.848 07:15:38 version -- app/version.sh@14 -- # tr -d '"' 00:10:05.848 07:15:38 version -- app/version.sh@19 -- # patch=0 00:10:05.848 07:15:38 version -- app/version.sh@20 -- # get_header_version suffix 00:10:05.849 07:15:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:05.849 07:15:38 version -- app/version.sh@14 -- # cut -f2 00:10:05.849 07:15:38 version -- app/version.sh@14 -- # tr -d '"' 00:10:05.849 07:15:38 version -- app/version.sh@20 -- # suffix=-pre 00:10:05.849 07:15:38 version -- app/version.sh@22 -- # version=24.9 00:10:05.849 07:15:38 version -- app/version.sh@25 -- # (( patch != 0 )) 00:10:05.849 07:15:38 version -- app/version.sh@28 -- # version=24.9rc0 00:10:05.849 07:15:38 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:10:05.849 07:15:38 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:10:06.108 07:15:38 version -- app/version.sh@30 -- # py_version=24.9rc0 00:10:06.108 07:15:38 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:10:06.108 00:10:06.108 real 0m0.189s 00:10:06.108 user 0m0.084s 00:10:06.108 sys 0m0.151s 00:10:06.108 07:15:38 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:06.108 07:15:38 version -- common/autotest_common.sh@10 -- # set +x 00:10:06.108 ************************************ 00:10:06.108 END TEST version 00:10:06.108 ************************************ 00:10:06.108 07:15:38 -- spdk/autotest.sh@192 -- # '[' 1 -eq 1 ']' 00:10:06.108 07:15:38 -- spdk/autotest.sh@193 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:10:06.108 07:15:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:06.108 07:15:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:06.108 07:15:38 -- 
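version.sh compares two sources of the version string: the fields of include/spdk/version.h, parsed with the grep/cut/tr pipeline traced above, and the spdk Python package queried via python3 -c 'import spdk; print(spdk.__version__)'. The header-side parsing condenses to roughly the following; the rule mapping the -pre suffix to rc0 is approximated here from the 24.9rc0 value seen in this run:

    # Extract one field from include/spdk/version.h,
    # e.g. '#define SPDK_VERSION_MAJOR 24' -> '24' (mirrors the traced pipeline).
    get_header_version() {
        grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" include/spdk/version.h \
            | cut -f2 | tr -d '"'
    }

    major=$(get_header_version MAJOR)     # 24 in this run
    minor=$(get_header_version MINOR)     # 9
    patch=$(get_header_version PATCH)     # 0
    suffix=$(get_header_version SUFFIX)   # -pre

    version="${major}.${minor}"
    (( patch != 0 )) && version="${version}.${patch}"
    [[ $suffix == -pre ]] && version="${version}rc0"   # approximation of version.sh's suffix handling
    echo "$version"                                    # 24.9rc0, matching spdk.__version__ below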
common/autotest_common.sh@10 -- # set +x 00:10:06.108 ************************************ 00:10:06.108 START TEST blockdev_general 00:10:06.108 ************************************ 00:10:06.108 07:15:38 blockdev_general -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:10:06.108 * Looking for test storage... 00:10:06.108 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:06.108 07:15:38 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1561334 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:10:06.108 07:15:38 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1561334 00:10:06.108 07:15:38 blockdev_general -- common/autotest_common.sh@831 -- # '[' -z 1561334 ']' 00:10:06.108 07:15:38 blockdev_general -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:06.108 07:15:38 blockdev_general -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:06.108 07:15:38 blockdev_general -- 
common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:06.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:06.108 07:15:38 blockdev_general -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:06.108 07:15:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:06.368 [2024-07-25 07:15:38.681573] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:10:06.368 [2024-07-25 07:15:38.681636] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561334 ] 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached 
maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:06.368 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.368 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:06.368 [2024-07-25 07:15:38.812613] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:06.368 [2024-07-25 07:15:38.900339] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.305 07:15:39 blockdev_general -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:07.305 07:15:39 blockdev_general -- common/autotest_common.sh@864 -- # return 0 00:10:07.305 07:15:39 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:10:07.305 07:15:39 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:10:07.305 07:15:39 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:10:07.305 07:15:39 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:07.305 07:15:39 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.306 [2024-07-25 07:15:39.806182] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:07.306 [2024-07-25 07:15:39.806238] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:07.306 00:10:07.306 [2024-07-25 07:15:39.814168] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:07.306 [2024-07-25 07:15:39.814191] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:07.306 00:10:07.306 Malloc0 00:10:07.564 Malloc1 00:10:07.564 Malloc2 00:10:07.564 Malloc3 00:10:07.564 Malloc4 00:10:07.564 Malloc5 00:10:07.564 Malloc6 00:10:07.564 Malloc7 00:10:07.564 Malloc8 00:10:07.564 Malloc9 00:10:07.564 [2024-07-25 07:15:39.948020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:07.564 [2024-07-25 07:15:39.948067] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:10:07.564 [2024-07-25 07:15:39.948084] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9a8a0 00:10:07.564 [2024-07-25 07:15:39.948096] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:07.564 [2024-07-25 07:15:39.949331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:07.564 [2024-07-25 07:15:39.949359] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:07.564 TestPT 00:10:07.564 07:15:39 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:07.564 07:15:39 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:10:07.564 5000+0 records in 00:10:07.564 5000+0 records out 00:10:07.564 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0371753 s, 275 MB/s 00:10:07.565 07:15:40 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.565 AIO0 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:07.565 07:15:40 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:07.565 07:15:40 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:10:07.565 07:15:40 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:07.565 07:15:40 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:07.565 07:15:40 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.824 07:15:40 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:07.824 07:15:40 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:10:07.824 07:15:40 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:07.824 07:15:40 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.824 07:15:40 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:07.824 07:15:40 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:10:07.824 07:15:40 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:10:07.824 07:15:40 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:10:07.824 07:15:40 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:07.824 07:15:40 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.824 07:15:40 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:07.824 07:15:40 blockdev_general -- bdev/blockdev.sh@748 -- # 
mapfile -t bdevs_name 00:10:07.824 07:15:40 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r .name 00:10:07.826 07:15:40 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ed3ac8fc-f7f7-48ca-ac63-999d8023b77c"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ed3ac8fc-f7f7-48ca-ac63-999d8023b77c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "1a93c10b-5432-5a78-957f-77010344738e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "1a93c10b-5432-5a78-957f-77010344738e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "b12a7de3-4bf0-5deb-ba42-dc966f2eedba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b12a7de3-4bf0-5deb-ba42-dc966f2eedba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "3562a809-2794-5941-bbaa-30d087ad9865"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3562a809-2794-5941-bbaa-30d087ad9865",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "37067070-100d-577c-ac50-7ca2f2be8d90"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "37067070-100d-577c-ac50-7ca2f2be8d90",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "5301bb85-d605-5559-802f-5723acc73aa4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5301bb85-d605-5559-802f-5723acc73aa4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "a091c216-71c7-5418-b3d3-21203eafaa11"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a091c216-71c7-5418-b3d3-21203eafaa11",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' 
}' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "dc1cc375-f4d2-53a6-8c7a-6458066d8d83"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "dc1cc375-f4d2-53a6-8c7a-6458066d8d83",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "a9e542f7-0a6a-5502-a926-602c78668cb4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a9e542f7-0a6a-5502-a926-602c78668cb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c1a6413b-bc92-5f30-b69e-f46122fadc9d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c1a6413b-bc92-5f30-b69e-f46122fadc9d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "1078a708-cda8-5308-92d4-0dbb0b48248f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1078a708-cda8-5308-92d4-0dbb0b48248f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "3b926705-49d8-5fb4-b003-368daab27267"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3b926705-49d8-5fb4-b003-368daab27267",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "945bcb1b-b0b0-440c-a11b-c27dff298311"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "945bcb1b-b0b0-440c-a11b-c27dff298311",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "945bcb1b-b0b0-440c-a11b-c27dff298311",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ad15b13a-fc18-4c61-b47d-3bc9c3285e6e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "8a91ebcd-167d-410e-a998-3777fef8bd66",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "386496ec-cb13-444c-a4af-a1669d4b7554"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "386496ec-cb13-444c-a4af-a1669d4b7554",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "386496ec-cb13-444c-a4af-a1669d4b7554",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "6202f513-d7cf-406a-8219-930d1538ce50",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "d415d0b3-7c73-478f-8c4c-2edf39e6de3a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "aa47f465-f753-4eb2-9aba-656e55c06000",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "f70f4e42-ee8b-411f-80e8-bc604309b45f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "239b5705-83ee-4f12-9173-ff9d84d7aea7"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' 
' "num_blocks": 5000,' ' "uuid": "239b5705-83ee-4f12-9173-ff9d84d7aea7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:08.085 07:15:40 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:10:08.085 07:15:40 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:10:08.085 07:15:40 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:10:08.085 07:15:40 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 1561334 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@950 -- # '[' -z 1561334 ']' 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@954 -- # kill -0 1561334 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@955 -- # uname 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1561334 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1561334' 00:10:08.085 killing process with pid 1561334 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@969 -- # kill 1561334 00:10:08.085 07:15:40 blockdev_general -- common/autotest_common.sh@974 -- # wait 1561334 00:10:08.343 07:15:40 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:08.343 07:15:40 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:10:08.343 07:15:40 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:10:08.343 07:15:40 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:08.343 07:15:40 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:08.602 ************************************ 00:10:08.602 START TEST bdev_hello_world 00:10:08.602 ************************************ 00:10:08.602 07:15:40 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:10:08.602 [2024-07-25 07:15:40.995726] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:10:08.602 [2024-07-25 07:15:40.995846] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561754 ] 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.861 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:08.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:08.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.862 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:08.862 [2024-07-25 07:15:41.202998] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:08.862 [2024-07-25 07:15:41.291387] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:09.120 [2024-07-25 07:15:41.437531] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:09.120 [2024-07-25 07:15:41.437584] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:09.120 [2024-07-25 07:15:41.437598] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:09.120 [2024-07-25 07:15:41.445540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:09.120 [2024-07-25 07:15:41.445565] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:09.120 [2024-07-25 07:15:41.453549] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:09.121 [2024-07-25 07:15:41.453571] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:09.121 [2024-07-25 07:15:41.524972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:09.121 [2024-07-25 07:15:41.525018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:09.121 [2024-07-25 07:15:41.525034] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1741230 00:10:09.121 [2024-07-25 07:15:41.525045] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:09.121 [2024-07-25 07:15:41.526330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:09.121 [2024-07-25 07:15:41.526357] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:09.379 [2024-07-25 07:15:41.673591] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:10:09.379 [2024-07-25 07:15:41.673659] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:10:09.380 [2024-07-25 07:15:41.673712] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:10:09.380 [2024-07-25 07:15:41.673785] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:10:09.380 
[2024-07-25 07:15:41.673860] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:10:09.380 [2024-07-25 07:15:41.673891] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:10:09.380 [2024-07-25 07:15:41.673953] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:10:09.380 00:10:09.380 [2024-07-25 07:15:41.673996] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:10:09.639 00:10:09.639 real 0m1.050s 00:10:09.639 user 0m0.644s 00:10:09.639 sys 0m0.361s 00:10:09.639 07:15:41 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:09.639 07:15:41 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:10:09.639 ************************************ 00:10:09.639 END TEST bdev_hello_world 00:10:09.639 ************************************ 00:10:09.639 07:15:41 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:10:09.639 07:15:41 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:09.639 07:15:41 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:09.639 07:15:41 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:09.639 ************************************ 00:10:09.639 START TEST bdev_bounds 00:10:09.639 ************************************ 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1562031 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1562031' 00:10:09.639 Process bdevio pid: 1562031 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1562031 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1562031 ']' 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:09.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:09.639 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:10:09.639 [2024-07-25 07:15:42.066082] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
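The hello_world stage that just finished drives SPDK's hello_bdev example against Malloc0: it opens the bdev, takes an I/O channel, writes a buffer, reads it back and reports "Read string from bdev : Hello World!" before stopping the app. The bounds stage starting here repeats a broader walk with the bdevio CUnit binary, whose output follows. Both can be rerun outside the harness; the commands below are a sketch that mirrors the invocations recorded in this log (same tree layout assumed):
  $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  # hello_bdev: one write plus one read against the bdev named with -b
  $ ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b Malloc0
  # bdevio: -w holds the app until tests.py sends the perform_tests RPC, as in the run above
  $ ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
  $ ./test/bdev/bdevio/tests.py perform_tests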
00:10:09.639 [2024-07-25 07:15:42.066136] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562031 ] 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:09.639 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.639 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:09.640 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.640 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:09.640 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.640 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:09.640 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.640 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:09.898 [2024-07-25 07:15:42.197887] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:10:09.898 [2024-07-25 07:15:42.285463] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:10:09.898 [2024-07-25 07:15:42.285557] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:10:09.898 [2024-07-25 07:15:42.285561] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:09.898 [2024-07-25 07:15:42.424997] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:09.898 [2024-07-25 07:15:42.425047] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:09.898 [2024-07-25 07:15:42.425059] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:09.898 [2024-07-25 07:15:42.433000] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:09.898 [2024-07-25 07:15:42.433025] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:10.157 [2024-07-25 07:15:42.441017] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:10.157 [2024-07-25 07:15:42.441039] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:10.157 [2024-07-25 07:15:42.512403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:10.157 [2024-07-25 07:15:42.512450] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:10.157 [2024-07-25 07:15:42.512466] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x200dd00 00:10:10.157 [2024-07-25 07:15:42.512477] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:10.157 [2024-07-25 07:15:42.513801] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:10.157 [2024-07-25 07:15:42.513828] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:10.724 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:10.724 07:15:42 blockdev_general.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:10:10.724 07:15:42 
blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:10:10.724 I/O targets: 00:10:10.724 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:10:10.724 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:10:10.724 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:10:10.724 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:10:10.724 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:10:10.724 raid0: 131072 blocks of 512 bytes (64 MiB) 00:10:10.724 concat0: 131072 blocks of 512 bytes (64 MiB) 00:10:10.724 raid1: 65536 blocks of 512 bytes (32 MiB) 00:10:10.724 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:10:10.724 00:10:10.724 00:10:10.724 CUnit - A unit testing framework for C - Version 2.1-3 00:10:10.724 http://cunit.sourceforge.net/ 00:10:10.724 00:10:10.724 00:10:10.724 Suite: bdevio tests on: AIO0 00:10:10.724 Test: blockdev write read block ...passed 00:10:10.724 Test: blockdev write zeroes read block ...passed 00:10:10.724 Test: blockdev write zeroes read no split ...passed 00:10:10.724 Test: blockdev write zeroes read split ...passed 00:10:10.724 Test: blockdev write zeroes read split partial ...passed 00:10:10.724 Test: blockdev reset ...passed 00:10:10.724 Test: blockdev write read 8 blocks ...passed 00:10:10.724 Test: blockdev write read size > 128k ...passed 00:10:10.724 Test: blockdev write read invalid size ...passed 00:10:10.724 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.724 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.724 Test: blockdev write read max offset ...passed 00:10:10.724 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.724 Test: blockdev writev readv 8 blocks ...passed 00:10:10.724 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.724 Test: blockdev writev readv block ...passed 00:10:10.724 Test: blockdev writev readv size > 128k ...passed 00:10:10.724 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.724 Test: blockdev comparev and writev ...passed 00:10:10.724 Test: blockdev nvme passthru rw ...passed 00:10:10.724 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.724 Test: blockdev nvme admin passthru ...passed 00:10:10.724 Test: blockdev copy ...passed 00:10:10.724 Suite: bdevio tests on: raid1 00:10:10.724 Test: blockdev write read block ...passed 00:10:10.724 Test: blockdev write zeroes read block ...passed 00:10:10.724 Test: blockdev write zeroes read no split ...passed 00:10:10.724 Test: blockdev write zeroes read split ...passed 00:10:10.724 Test: blockdev write zeroes read split partial ...passed 00:10:10.724 Test: blockdev reset ...passed 00:10:10.724 Test: blockdev write read 8 blocks ...passed 00:10:10.724 Test: blockdev write read size > 128k ...passed 00:10:10.724 Test: blockdev write read invalid size ...passed 00:10:10.724 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.724 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.724 Test: blockdev write read max offset ...passed 
00:10:10.724 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.724 Test: blockdev writev readv 8 blocks ...passed 00:10:10.724 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.724 Test: blockdev writev readv block ...passed 00:10:10.724 Test: blockdev writev readv size > 128k ...passed 00:10:10.724 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.724 Test: blockdev comparev and writev ...passed 00:10:10.724 Test: blockdev nvme passthru rw ...passed 00:10:10.724 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.724 Test: blockdev nvme admin passthru ...passed 00:10:10.724 Test: blockdev copy ...passed 00:10:10.724 Suite: bdevio tests on: concat0 00:10:10.724 Test: blockdev write read block ...passed 00:10:10.724 Test: blockdev write zeroes read block ...passed 00:10:10.724 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.725 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.725 Test: blockdev write read max offset ...passed 00:10:10.725 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.725 Test: blockdev writev readv 8 blocks ...passed 00:10:10.725 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.725 Test: blockdev writev readv block ...passed 00:10:10.725 Test: blockdev writev readv size > 128k ...passed 00:10:10.725 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.725 Test: blockdev comparev and writev ...passed 00:10:10.725 Test: blockdev nvme passthru rw ...passed 00:10:10.725 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.725 Test: blockdev nvme admin passthru ...passed 00:10:10.725 Test: blockdev copy ...passed 00:10:10.725 Suite: bdevio tests on: raid0 00:10:10.725 Test: blockdev write read block ...passed 00:10:10.725 Test: blockdev write zeroes read block ...passed 00:10:10.725 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.725 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.725 Test: blockdev write read max offset ...passed 00:10:10.725 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.725 Test: blockdev writev readv 8 blocks ...passed 00:10:10.725 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.725 Test: blockdev writev readv block ...passed 00:10:10.725 Test: blockdev writev readv size > 128k ...passed 00:10:10.725 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.725 Test: blockdev comparev and writev ...passed 00:10:10.725 Test: blockdev nvme passthru rw ...passed 00:10:10.725 Test: blockdev nvme 
passthru vendor specific ...passed 00:10:10.725 Test: blockdev nvme admin passthru ...passed 00:10:10.725 Test: blockdev copy ...passed 00:10:10.725 Suite: bdevio tests on: TestPT 00:10:10.725 Test: blockdev write read block ...passed 00:10:10.725 Test: blockdev write zeroes read block ...passed 00:10:10.725 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.725 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.725 Test: blockdev write read max offset ...passed 00:10:10.725 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.725 Test: blockdev writev readv 8 blocks ...passed 00:10:10.725 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.725 Test: blockdev writev readv block ...passed 00:10:10.725 Test: blockdev writev readv size > 128k ...passed 00:10:10.725 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.725 Test: blockdev comparev and writev ...passed 00:10:10.725 Test: blockdev nvme passthru rw ...passed 00:10:10.725 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.725 Test: blockdev nvme admin passthru ...passed 00:10:10.725 Test: blockdev copy ...passed 00:10:10.725 Suite: bdevio tests on: Malloc2p7 00:10:10.725 Test: blockdev write read block ...passed 00:10:10.725 Test: blockdev write zeroes read block ...passed 00:10:10.725 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.725 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.725 Test: blockdev write read max offset ...passed 00:10:10.725 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.725 Test: blockdev writev readv 8 blocks ...passed 00:10:10.725 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.725 Test: blockdev writev readv block ...passed 00:10:10.725 Test: blockdev writev readv size > 128k ...passed 00:10:10.725 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.725 Test: blockdev comparev and writev ...passed 00:10:10.725 Test: blockdev nvme passthru rw ...passed 00:10:10.725 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.725 Test: blockdev nvme admin passthru ...passed 00:10:10.725 Test: blockdev copy ...passed 00:10:10.725 Suite: bdevio tests on: Malloc2p6 00:10:10.725 Test: blockdev write read block ...passed 00:10:10.725 Test: blockdev write zeroes read block ...passed 00:10:10.725 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 
Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.725 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.725 Test: blockdev write read max offset ...passed 00:10:10.725 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.725 Test: blockdev writev readv 8 blocks ...passed 00:10:10.725 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.725 Test: blockdev writev readv block ...passed 00:10:10.725 Test: blockdev writev readv size > 128k ...passed 00:10:10.725 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.725 Test: blockdev comparev and writev ...passed 00:10:10.725 Test: blockdev nvme passthru rw ...passed 00:10:10.725 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.725 Test: blockdev nvme admin passthru ...passed 00:10:10.725 Test: blockdev copy ...passed 00:10:10.725 Suite: bdevio tests on: Malloc2p5 00:10:10.725 Test: blockdev write read block ...passed 00:10:10.725 Test: blockdev write zeroes read block ...passed 00:10:10.725 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.725 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.725 Test: blockdev write read max offset ...passed 00:10:10.725 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.725 Test: blockdev writev readv 8 blocks ...passed 00:10:10.725 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.725 Test: blockdev writev readv block ...passed 00:10:10.725 Test: blockdev writev readv size > 128k ...passed 00:10:10.725 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.725 Test: blockdev comparev and writev ...passed 00:10:10.725 Test: blockdev nvme passthru rw ...passed 00:10:10.725 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.725 Test: blockdev nvme admin passthru ...passed 00:10:10.725 Test: blockdev copy ...passed 00:10:10.725 Suite: bdevio tests on: Malloc2p4 00:10:10.725 Test: blockdev write read block ...passed 00:10:10.725 Test: blockdev write zeroes read block ...passed 00:10:10.725 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.725 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.725 Test: blockdev write read max offset ...passed 00:10:10.725 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.725 Test: blockdev writev readv 8 blocks ...passed 
00:10:10.725 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.725 Test: blockdev writev readv block ...passed 00:10:10.725 Test: blockdev writev readv size > 128k ...passed 00:10:10.725 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.725 Test: blockdev comparev and writev ...passed 00:10:10.725 Test: blockdev nvme passthru rw ...passed 00:10:10.725 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.725 Test: blockdev nvme admin passthru ...passed 00:10:10.725 Test: blockdev copy ...passed 00:10:10.725 Suite: bdevio tests on: Malloc2p3 00:10:10.725 Test: blockdev write read block ...passed 00:10:10.725 Test: blockdev write zeroes read block ...passed 00:10:10.725 Test: blockdev write zeroes read no split ...passed 00:10:10.725 Test: blockdev write zeroes read split ...passed 00:10:10.725 Test: blockdev write zeroes read split partial ...passed 00:10:10.725 Test: blockdev reset ...passed 00:10:10.725 Test: blockdev write read 8 blocks ...passed 00:10:10.725 Test: blockdev write read size > 128k ...passed 00:10:10.725 Test: blockdev write read invalid size ...passed 00:10:10.725 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.726 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.726 Test: blockdev write read max offset ...passed 00:10:10.726 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.726 Test: blockdev writev readv 8 blocks ...passed 00:10:10.726 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.726 Test: blockdev writev readv block ...passed 00:10:10.726 Test: blockdev writev readv size > 128k ...passed 00:10:10.726 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.726 Test: blockdev comparev and writev ...passed 00:10:10.726 Test: blockdev nvme passthru rw ...passed 00:10:10.726 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.726 Test: blockdev nvme admin passthru ...passed 00:10:10.726 Test: blockdev copy ...passed 00:10:10.726 Suite: bdevio tests on: Malloc2p2 00:10:10.726 Test: blockdev write read block ...passed 00:10:10.726 Test: blockdev write zeroes read block ...passed 00:10:10.726 Test: blockdev write zeroes read no split ...passed 00:10:10.726 Test: blockdev write zeroes read split ...passed 00:10:10.726 Test: blockdev write zeroes read split partial ...passed 00:10:10.726 Test: blockdev reset ...passed 00:10:10.726 Test: blockdev write read 8 blocks ...passed 00:10:10.726 Test: blockdev write read size > 128k ...passed 00:10:10.726 Test: blockdev write read invalid size ...passed 00:10:10.726 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.726 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.726 Test: blockdev write read max offset ...passed 00:10:10.726 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.726 Test: blockdev writev readv 8 blocks ...passed 00:10:10.726 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.726 Test: blockdev writev readv block ...passed 00:10:10.726 Test: blockdev writev readv size > 128k ...passed 00:10:10.726 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.726 Test: blockdev comparev and writev ...passed 00:10:10.726 Test: blockdev nvme passthru rw ...passed 00:10:10.726 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.726 Test: blockdev nvme admin passthru ...passed 00:10:10.726 Test: blockdev copy ...passed 
00:10:10.726 Suite: bdevio tests on: Malloc2p1 00:10:10.726 Test: blockdev write read block ...passed 00:10:10.726 Test: blockdev write zeroes read block ...passed 00:10:10.726 Test: blockdev write zeroes read no split ...passed 00:10:10.726 Test: blockdev write zeroes read split ...passed 00:10:10.985 Test: blockdev write zeroes read split partial ...passed 00:10:10.985 Test: blockdev reset ...passed 00:10:10.985 Test: blockdev write read 8 blocks ...passed 00:10:10.985 Test: blockdev write read size > 128k ...passed 00:10:10.985 Test: blockdev write read invalid size ...passed 00:10:10.985 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.985 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.985 Test: blockdev write read max offset ...passed 00:10:10.985 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.985 Test: blockdev writev readv 8 blocks ...passed 00:10:10.985 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.985 Test: blockdev writev readv block ...passed 00:10:10.985 Test: blockdev writev readv size > 128k ...passed 00:10:10.985 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.985 Test: blockdev comparev and writev ...passed 00:10:10.985 Test: blockdev nvme passthru rw ...passed 00:10:10.985 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.985 Test: blockdev nvme admin passthru ...passed 00:10:10.985 Test: blockdev copy ...passed 00:10:10.985 Suite: bdevio tests on: Malloc2p0 00:10:10.985 Test: blockdev write read block ...passed 00:10:10.985 Test: blockdev write zeroes read block ...passed 00:10:10.985 Test: blockdev write zeroes read no split ...passed 00:10:10.985 Test: blockdev write zeroes read split ...passed 00:10:10.985 Test: blockdev write zeroes read split partial ...passed 00:10:10.985 Test: blockdev reset ...passed 00:10:10.985 Test: blockdev write read 8 blocks ...passed 00:10:10.985 Test: blockdev write read size > 128k ...passed 00:10:10.985 Test: blockdev write read invalid size ...passed 00:10:10.985 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.985 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.985 Test: blockdev write read max offset ...passed 00:10:10.985 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.985 Test: blockdev writev readv 8 blocks ...passed 00:10:10.985 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.985 Test: blockdev writev readv block ...passed 00:10:10.985 Test: blockdev writev readv size > 128k ...passed 00:10:10.985 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.985 Test: blockdev comparev and writev ...passed 00:10:10.985 Test: blockdev nvme passthru rw ...passed 00:10:10.985 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.985 Test: blockdev nvme admin passthru ...passed 00:10:10.985 Test: blockdev copy ...passed 00:10:10.985 Suite: bdevio tests on: Malloc1p1 00:10:10.985 Test: blockdev write read block ...passed 00:10:10.985 Test: blockdev write zeroes read block ...passed 00:10:10.985 Test: blockdev write zeroes read no split ...passed 00:10:10.985 Test: blockdev write zeroes read split ...passed 00:10:10.985 Test: blockdev write zeroes read split partial ...passed 00:10:10.985 Test: blockdev reset ...passed 00:10:10.985 Test: blockdev write read 8 blocks ...passed 00:10:10.985 Test: blockdev write read size > 128k ...passed 00:10:10.985 Test: 
blockdev write read invalid size ...passed 00:10:10.985 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.985 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.985 Test: blockdev write read max offset ...passed 00:10:10.985 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.985 Test: blockdev writev readv 8 blocks ...passed 00:10:10.985 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.985 Test: blockdev writev readv block ...passed 00:10:10.985 Test: blockdev writev readv size > 128k ...passed 00:10:10.985 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.985 Test: blockdev comparev and writev ...passed 00:10:10.985 Test: blockdev nvme passthru rw ...passed 00:10:10.985 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.985 Test: blockdev nvme admin passthru ...passed 00:10:10.985 Test: blockdev copy ...passed 00:10:10.985 Suite: bdevio tests on: Malloc1p0 00:10:10.985 Test: blockdev write read block ...passed 00:10:10.985 Test: blockdev write zeroes read block ...passed 00:10:10.985 Test: blockdev write zeroes read no split ...passed 00:10:10.985 Test: blockdev write zeroes read split ...passed 00:10:10.985 Test: blockdev write zeroes read split partial ...passed 00:10:10.985 Test: blockdev reset ...passed 00:10:10.985 Test: blockdev write read 8 blocks ...passed 00:10:10.985 Test: blockdev write read size > 128k ...passed 00:10:10.985 Test: blockdev write read invalid size ...passed 00:10:10.985 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.985 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.985 Test: blockdev write read max offset ...passed 00:10:10.985 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.985 Test: blockdev writev readv 8 blocks ...passed 00:10:10.985 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.985 Test: blockdev writev readv block ...passed 00:10:10.985 Test: blockdev writev readv size > 128k ...passed 00:10:10.985 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.985 Test: blockdev comparev and writev ...passed 00:10:10.985 Test: blockdev nvme passthru rw ...passed 00:10:10.985 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.985 Test: blockdev nvme admin passthru ...passed 00:10:10.985 Test: blockdev copy ...passed 00:10:10.985 Suite: bdevio tests on: Malloc0 00:10:10.985 Test: blockdev write read block ...passed 00:10:10.985 Test: blockdev write zeroes read block ...passed 00:10:10.985 Test: blockdev write zeroes read no split ...passed 00:10:10.985 Test: blockdev write zeroes read split ...passed 00:10:10.985 Test: blockdev write zeroes read split partial ...passed 00:10:10.985 Test: blockdev reset ...passed 00:10:10.985 Test: blockdev write read 8 blocks ...passed 00:10:10.985 Test: blockdev write read size > 128k ...passed 00:10:10.985 Test: blockdev write read invalid size ...passed 00:10:10.985 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:10.985 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:10.985 Test: blockdev write read max offset ...passed 00:10:10.985 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:10.985 Test: blockdev writev readv 8 blocks ...passed 00:10:10.985 Test: blockdev writev readv 30 x 1block ...passed 00:10:10.985 Test: blockdev writev readv block ...passed 00:10:10.985 
Test: blockdev writev readv size > 128k ...passed 00:10:10.985 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:10.985 Test: blockdev comparev and writev ...passed 00:10:10.985 Test: blockdev nvme passthru rw ...passed 00:10:10.985 Test: blockdev nvme passthru vendor specific ...passed 00:10:10.985 Test: blockdev nvme admin passthru ...passed 00:10:10.985 Test: blockdev copy ...passed 00:10:10.985 00:10:10.985 Run Summary: Type Total Ran Passed Failed Inactive 00:10:10.985 suites 16 16 n/a 0 0 00:10:10.985 tests 368 368 368 0 0 00:10:10.985 asserts 2224 2224 2224 0 n/a 00:10:10.985 00:10:10.985 Elapsed time = 0.477 seconds 00:10:10.985 0 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1562031 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1562031 ']' 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1562031 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1562031 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1562031' 00:10:10.985 killing process with pid 1562031 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1562031 00:10:10.985 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1562031 00:10:11.244 07:15:43 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:10:11.244 00:10:11.244 real 0m1.624s 00:10:11.244 user 0m4.105s 00:10:11.244 sys 0m0.450s 00:10:11.244 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:11.244 07:15:43 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:10:11.244 ************************************ 00:10:11.244 END TEST bdev_bounds 00:10:11.244 ************************************ 00:10:11.244 07:15:43 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:10:11.244 07:15:43 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:11.244 07:15:43 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:11.244 07:15:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:11.244 ************************************ 00:10:11.244 START TEST bdev_nbd 00:10:11.244 ************************************ 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 
-- # uname -s 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1562331 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:10:11.244 07:15:43 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1562331 /var/tmp/spdk-nbd.sock 00:10:11.245 07:15:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1562331 ']' 00:10:11.245 07:15:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:10:11.245 07:15:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:11.245 07:15:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:10:11.245 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
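Condensed for reference, the setup that the trace above and below walks through looks roughly like the following. This is a sketch reconstructed only from the commands visible in this log, with paths shortened to the repo root; waitforlisten is the helper traced from autotest_common.sh, and the exact option handling inside nbd_common.sh may differ.

    # Start the bdev service with the test JSON config on a private RPC socket
    ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json ./test/bdev/bdev.json &
    nbd_pid=$!
    waitforlisten "$nbd_pid" /var/tmp/spdk-nbd.sock

    # Export one bdev over NBD; the RPC prints the /dev/nbdX node it allocated
    nbd_device=$(./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0)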
00:10:11.245 07:15:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:11.245 07:15:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:11.504 [2024-07-25 07:15:43.789496] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:10:11.504 [2024-07-25 07:15:43.789553] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.4 cannot be used 
00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:11.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.504 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:11.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.505 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:11.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.505 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:11.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.505 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:11.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.505 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:11.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.505 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:11.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.505 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:11.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.505 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:11.505 [2024-07-25 07:15:43.923598] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:11.505 [2024-07-25 07:15:44.010224] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:11.765 [2024-07-25 07:15:44.161482] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:11.765 [2024-07-25 07:15:44.161532] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:11.765 [2024-07-25 07:15:44.161546] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:11.765 [2024-07-25 07:15:44.169494] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:11.765 [2024-07-25 07:15:44.169520] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:11.765 [2024-07-25 07:15:44.177505] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:11.765 [2024-07-25 07:15:44.177528] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:11.765 [2024-07-25 07:15:44.248516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:11.765 [2024-07-25 07:15:44.248562] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:11.765 [2024-07-25 07:15:44.248578] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf90b70 00:10:11.765 [2024-07-25 07:15:44.248589] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:11.765 [2024-07-25 07:15:44.249893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:11.765 [2024-07-25 07:15:44.249920] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@864 -- # return 0 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:12.334 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:12.593 1+0 records in 00:10:12.593 1+0 records out 00:10:12.593 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232107 s, 17.6 MB/s 00:10:12.593 07:15:44 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:12.593 07:15:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:12.852 1+0 records in 00:10:12.852 1+0 records out 00:10:12.852 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256636 s, 16.0 MB/s 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:12.852 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:13.111 1+0 records in 00:10:13.111 1+0 records out 00:10:13.111 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032732 s, 12.5 MB/s 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:13.111 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:10:13.370 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:10:13.371 1+0 records in 00:10:13.371 1+0 records out 00:10:13.371 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328187 s, 12.5 MB/s 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:13.371 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:10:13.630 07:15:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:13.630 1+0 records in 00:10:13.630 1+0 records out 00:10:13.630 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000364193 s, 11.2 MB/s 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:13.630 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 
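The same readiness check and data-path verification repeats below for every exported device. Condensed, and again reconstructed only from the commands visible in this trace (the retry delay is an assumption; the log shows only the loop counters), the per-device step amounts to:

    # Wait until the kernel lists the new device in /proc/partitions (up to 20 tries)
    for i in $(seq 1 20); do
        grep -q -w nbd5 /proc/partitions && break
        sleep 0.1   # assumed back-off; not visible in this trace
    done

    # Read one 4 KiB block back through the NBD node and confirm a non-zero copy
    dd if=/dev/nbd5 of=./test/bdev/nbdtest bs=4096 count=1 iflag=direct
    size=$(stat -c %s ./test/bdev/nbdtest)
    rm -f ./test/bdev/nbdtest
    [ "$size" != 0 ]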
00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:13.888 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:13.888 1+0 records in 00:10:13.888 1+0 records out 00:10:13.889 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000427875 s, 9.6 MB/s 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:13.889 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:10:14.147 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:10:14.147 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:10:14.147 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:14.148 07:15:46 
blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:14.148 1+0 records in 00:10:14.148 1+0 records out 00:10:14.148 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000443228 s, 9.2 MB/s 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:14.148 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:10:14.415 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:14.416 1+0 records in 00:10:14.416 1+0 records out 00:10:14.416 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000437118 s, 9.4 MB/s 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:14.416 07:15:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:14.416 07:15:46 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:14.720 1+0 records in 00:10:14.720 1+0 records out 00:10:14.720 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000538557 s, 7.6 MB/s 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:14.720 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:14.721 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:10:14.991 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:10:14.991 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:10:14.991 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:10:14.991 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:10:14.991 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:14.991 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:14.992 1+0 records in 00:10:14.992 1+0 records out 00:10:14.992 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000414556 s, 9.9 MB/s 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:14.992 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:15.250 1+0 records in 00:10:15.250 1+0 records out 00:10:15.250 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000515918 s, 7.9 MB/s 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:15.250 07:15:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:15.250 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:15.509 1+0 records in 00:10:15.509 1+0 records out 00:10:15.509 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000506201 s, 8.1 MB/s 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:15.509 07:15:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w 
nbd12 /proc/partitions 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:15.768 1+0 records in 00:10:15.768 1+0 records out 00:10:15.768 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000525441 s, 7.8 MB/s 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:15.768 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:16.027 1+0 records in 00:10:16.027 1+0 records out 00:10:16.027 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000479171 s, 8.5 MB/s 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:16.027 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd14 /proc/partitions 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:16.285 1+0 records in 00:10:16.285 1+0 records out 00:10:16.285 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000775484 s, 5.3 MB/s 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:16.285 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:10:16.542 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:16.543 07:15:48 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:16.543 1+0 records in 00:10:16.543 1+0 records out 00:10:16.543 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000527888 s, 7.8 MB/s 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:16.543 07:15:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:16.800 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd0", 00:10:16.800 "bdev_name": "Malloc0" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd1", 00:10:16.800 "bdev_name": "Malloc1p0" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd2", 00:10:16.800 "bdev_name": "Malloc1p1" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd3", 00:10:16.800 "bdev_name": "Malloc2p0" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd4", 00:10:16.800 "bdev_name": "Malloc2p1" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd5", 00:10:16.800 "bdev_name": "Malloc2p2" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd6", 00:10:16.800 "bdev_name": "Malloc2p3" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd7", 00:10:16.800 "bdev_name": "Malloc2p4" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd8", 00:10:16.800 "bdev_name": "Malloc2p5" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd9", 00:10:16.800 "bdev_name": "Malloc2p6" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd10", 00:10:16.800 "bdev_name": "Malloc2p7" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd11", 00:10:16.800 "bdev_name": "TestPT" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd12", 00:10:16.800 "bdev_name": "raid0" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd13", 00:10:16.800 "bdev_name": "concat0" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 "nbd_device": "/dev/nbd14", 00:10:16.800 "bdev_name": "raid1" 00:10:16.800 }, 00:10:16.800 { 00:10:16.800 
"nbd_device": "/dev/nbd15", 00:10:16.800 "bdev_name": "AIO0" 00:10:16.801 } 00:10:16.801 ]' 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd0", 00:10:16.801 "bdev_name": "Malloc0" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd1", 00:10:16.801 "bdev_name": "Malloc1p0" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd2", 00:10:16.801 "bdev_name": "Malloc1p1" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd3", 00:10:16.801 "bdev_name": "Malloc2p0" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd4", 00:10:16.801 "bdev_name": "Malloc2p1" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd5", 00:10:16.801 "bdev_name": "Malloc2p2" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd6", 00:10:16.801 "bdev_name": "Malloc2p3" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd7", 00:10:16.801 "bdev_name": "Malloc2p4" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd8", 00:10:16.801 "bdev_name": "Malloc2p5" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd9", 00:10:16.801 "bdev_name": "Malloc2p6" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd10", 00:10:16.801 "bdev_name": "Malloc2p7" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd11", 00:10:16.801 "bdev_name": "TestPT" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd12", 00:10:16.801 "bdev_name": "raid0" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd13", 00:10:16.801 "bdev_name": "concat0" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd14", 00:10:16.801 "bdev_name": "raid1" 00:10:16.801 }, 00:10:16.801 { 00:10:16.801 "nbd_device": "/dev/nbd15", 00:10:16.801 "bdev_name": "AIO0" 00:10:16.801 } 00:10:16.801 ]' 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.801 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:17.058 
07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.058 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.315 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.572 07:15:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.572 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:10:17.830 
07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.830 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.087 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.344 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.602 07:15:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.859 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:19.117 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:19.374 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd10 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:19.631 07:15:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:19.889 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:20.147 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@41 -- # break 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:10:20.406 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:20.665 07:15:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:20.665 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:20.924 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:20.924 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:20.924 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # 
count=0 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:21.182 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:21.183 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:21.183 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:21.183 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:10:21.183 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:21.183 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:21.183 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:10:21.183 /dev/nbd0 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:10:21.441 
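Note (sketch, not part of the original trace): before re-attaching anything, the nbd_get_count step above asserts that nothing is still exported: it re-queries nbd_get_disks, counts the /dev/nbd* entries, and expects 0 before the restart phase begins. A condensed sketch under the same assumptions; grep -c exits non-zero when there are no matches, hence the || true.

    #!/usr/bin/env bash
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    count=$("$rpc" -s "$sock" nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    if [ "$count" -ne 0 ]; then
        echo "ERROR: $count NBD device(s) still exported" >&2
        exit 1
    fi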
07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.441 1+0 records in 00:10:21.441 1+0 records out 00:10:21.441 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024496 s, 16.7 MB/s 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.441 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:21.442 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:10:21.442 /dev/nbd1 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.700 1+0 records in 00:10:21.700 1+0 records out 00:10:21.700 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301533 s, 13.6 MB/s 00:10:21.700 07:15:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.700 
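Note (sketch, not part of the original trace): the start-up counterpart of the wait loop is visible above. After nbd_start_disk maps a bdev onto /dev/nbdX, waitfornbd polls /proc/partitions for the name and then proves the device is readable by dd'ing a single 4 KiB block with O_DIRECT into a scratch file and checking that the copied size is non-zero. A minimal sketch of that readiness probe; the scratch path is taken from the trace, the retry interval is an assumption.

    #!/usr/bin/env bash
    wait_for_nbd() {
        local name=$1 i size
        local scratch=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$name" /proc/partitions && break
            sleep 0.1
        done
        # read one block straight off the device to confirm it answers I/O
        dd if=/dev/"$name" of="$scratch" bs=4096 count=1 iflag=direct
        size=$(stat -c %s "$scratch")
        rm -f "$scratch"
        [ "$size" != 0 ]   # non-empty read -> device is ready
    }
    wait_for_nbd nbd0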
07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:21.700 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.700 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:21.700 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:21.700 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.700 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:21.700 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:10:21.700 /dev/nbd10 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.960 1+0 records in 00:10:21.960 1+0 records out 00:10:21.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319976 s, 12.8 MB/s 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:21.960 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:10:21.960 /dev/nbd11 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:10:22.219 07:15:54 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # local i 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:22.219 1+0 records in 00:10:22.219 1+0 records out 00:10:22.219 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000332204 s, 12.3 MB/s 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:22.219 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:10:22.478 /dev/nbd12 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:22.479 1+0 records in 00:10:22.479 1+0 records out 00:10:22.479 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304015 s, 13.5 MB/s 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.479 07:15:54 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:22.479 07:15:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:10:22.737 /dev/nbd13 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:22.737 1+0 records in 00:10:22.737 1+0 records out 00:10:22.737 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000427138 s, 9.6 MB/s 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:22.737 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:10:22.995 /dev/nbd14 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # local i 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd14 /proc/partitions 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:22.995 1+0 records in 00:10:22.995 1+0 records out 00:10:22.995 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000437169 s, 9.4 MB/s 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:22.995 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:22.996 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:22.996 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:10:23.255 /dev/nbd15 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.255 1+0 records in 00:10:23.255 1+0 records out 00:10:23.255 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000469838 s, 8.7 MB/s 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # size=4096 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:23.255 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:10:23.514 /dev/nbd2 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.514 1+0 records in 00:10:23.514 1+0 records out 00:10:23.514 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000501085 s, 8.2 MB/s 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:23.514 07:15:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:10:23.773 /dev/nbd3 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 
00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.773 1+0 records in 00:10:23.773 1+0 records out 00:10:23.773 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000509615 s, 8.0 MB/s 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:23.773 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:10:24.032 /dev/nbd4 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:24.032 1+0 records in 00:10:24.032 1+0 records out 00:10:24.032 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000605701 s, 6.8 MB/s 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 
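Note (sketch, not part of the original trace): the restart phase above walks two parallel arrays, pairing the i-th bdev name with the i-th /dev/nbd* path (deliberately shuffled relative to the first pass) and issuing one nbd_start_disk RPC per pair. A condensed sketch of that pairing loop; the two-element lists are purely illustrative, the real run maps all sixteen bdevs.

    #!/usr/bin/env bash
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    bdev_list=('Malloc0' 'Malloc1p0')     # trimmed for illustration
    nbd_list=('/dev/nbd0' '/dev/nbd1')    # order intentionally differs from bdev order in the real run
    for ((i = 0; i < ${#bdev_list[@]}; i++)); do
        "$rpc" -s "$sock" nbd_start_disk "${bdev_list[i]}" "${nbd_list[i]}"
    done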
00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:24.032 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:10:24.291 /dev/nbd5 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:24.291 1+0 records in 00:10:24.291 1+0 records out 00:10:24.291 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000379674 s, 10.8 MB/s 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:24.291 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:10:24.550 /dev/nbd6 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:24.550 1+0 records in 00:10:24.550 1+0 records out 00:10:24.550 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000558757 s, 7.3 MB/s 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:24.550 07:15:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:10:24.809 /dev/nbd7 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:24.809 1+0 records in 00:10:24.809 1+0 records out 00:10:24.809 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000487321 s, 8.4 MB/s 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:24.809 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:10:25.068 /dev/nbd8 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:25.068 1+0 records in 00:10:25.068 1+0 records out 00:10:25.068 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000707572 s, 5.8 MB/s 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:25.068 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:10:25.327 /dev/nbd9 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 
00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:25.327 1+0 records in 00:10:25.327 1+0 records out 00:10:25.327 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00072378 s, 5.7 MB/s 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:25.327 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:25.586 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd0", 00:10:25.586 "bdev_name": "Malloc0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd1", 00:10:25.586 "bdev_name": "Malloc1p0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd10", 00:10:25.586 "bdev_name": "Malloc1p1" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd11", 00:10:25.586 "bdev_name": "Malloc2p0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd12", 00:10:25.586 "bdev_name": "Malloc2p1" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd13", 00:10:25.586 "bdev_name": "Malloc2p2" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd14", 00:10:25.586 "bdev_name": "Malloc2p3" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd15", 00:10:25.586 "bdev_name": "Malloc2p4" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd2", 00:10:25.586 "bdev_name": "Malloc2p5" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd3", 00:10:25.586 "bdev_name": "Malloc2p6" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd4", 00:10:25.586 "bdev_name": "Malloc2p7" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd5", 00:10:25.586 "bdev_name": "TestPT" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd6", 00:10:25.586 "bdev_name": "raid0" 
00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd7", 00:10:25.586 "bdev_name": "concat0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd8", 00:10:25.586 "bdev_name": "raid1" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd9", 00:10:25.586 "bdev_name": "AIO0" 00:10:25.586 } 00:10:25.586 ]' 00:10:25.586 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd0", 00:10:25.586 "bdev_name": "Malloc0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd1", 00:10:25.586 "bdev_name": "Malloc1p0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd10", 00:10:25.586 "bdev_name": "Malloc1p1" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd11", 00:10:25.586 "bdev_name": "Malloc2p0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd12", 00:10:25.586 "bdev_name": "Malloc2p1" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd13", 00:10:25.586 "bdev_name": "Malloc2p2" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd14", 00:10:25.586 "bdev_name": "Malloc2p3" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd15", 00:10:25.586 "bdev_name": "Malloc2p4" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd2", 00:10:25.586 "bdev_name": "Malloc2p5" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd3", 00:10:25.586 "bdev_name": "Malloc2p6" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd4", 00:10:25.586 "bdev_name": "Malloc2p7" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd5", 00:10:25.586 "bdev_name": "TestPT" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd6", 00:10:25.586 "bdev_name": "raid0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd7", 00:10:25.586 "bdev_name": "concat0" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd8", 00:10:25.586 "bdev_name": "raid1" 00:10:25.586 }, 00:10:25.586 { 00:10:25.586 "nbd_device": "/dev/nbd9", 00:10:25.586 "bdev_name": "AIO0" 00:10:25.586 } 00:10:25.586 ]' 00:10:25.586 07:15:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:10:25.586 /dev/nbd1 00:10:25.586 /dev/nbd10 00:10:25.586 /dev/nbd11 00:10:25.586 /dev/nbd12 00:10:25.586 /dev/nbd13 00:10:25.586 /dev/nbd14 00:10:25.586 /dev/nbd15 00:10:25.586 /dev/nbd2 00:10:25.586 /dev/nbd3 00:10:25.586 /dev/nbd4 00:10:25.586 /dev/nbd5 00:10:25.586 /dev/nbd6 00:10:25.586 /dev/nbd7 00:10:25.586 /dev/nbd8 00:10:25.586 /dev/nbd9' 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:10:25.586 /dev/nbd1 00:10:25.586 /dev/nbd10 00:10:25.586 /dev/nbd11 00:10:25.586 /dev/nbd12 00:10:25.586 /dev/nbd13 00:10:25.586 /dev/nbd14 00:10:25.586 /dev/nbd15 00:10:25.586 /dev/nbd2 00:10:25.586 /dev/nbd3 00:10:25.586 /dev/nbd4 00:10:25.586 /dev/nbd5 00:10:25.586 /dev/nbd6 00:10:25.586 /dev/nbd7 00:10:25.586 /dev/nbd8 00:10:25.586 /dev/nbd9' 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:10:25.586 
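Note (sketch, not part of the original trace): with all sixteen mappings re-established, the entries here count the /dev/nbd* paths returned by nbd_get_disks, check the total against the expected 16, and then, in the dd lines that follow, push a 1 MiB random reference pattern through every device with O_DIRECT. A hedged sketch of the write half of that verification; the read-back and compare happen later in the test and are outside this excerpt, and the trimmed device list is illustrative only.

    #!/usr/bin/env bash
    scratch=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
    dd if=/dev/urandom of="$scratch" bs=4096 count=256            # 1 MiB reference pattern
    for dev in /dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11; do      # trimmed; the run covers all 16 devices
        dd if="$scratch" of="$dev" bs=4096 count=256 oflag=direct
    done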
07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:10:25.586 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:10:25.586 256+0 records in 00:10:25.587 256+0 records out 00:10:25.587 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010415 s, 101 MB/s 00:10:25.587 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.587 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:10:25.846 256+0 records in 00:10:25.846 256+0 records out 00:10:25.846 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.132594 s, 7.9 MB/s 00:10:25.846 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.847 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:10:25.847 256+0 records in 00:10:25.847 256+0 records out 00:10:25.847 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167979 s, 6.2 MB/s 00:10:25.847 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.847 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:10:26.140 256+0 records in 00:10:26.140 256+0 records out 00:10:26.140 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167803 s, 6.2 MB/s 00:10:26.140 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.140 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:10:26.398 256+0 records in 00:10:26.398 256+0 records out 00:10:26.398 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167602 s, 6.3 MB/s 00:10:26.398 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.399 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:10:26.399 256+0 records in 00:10:26.399 256+0 records out 00:10:26.399 
1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168035 s, 6.2 MB/s 00:10:26.399 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.399 07:15:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:10:26.657 256+0 records in 00:10:26.657 256+0 records out 00:10:26.657 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.139396 s, 7.5 MB/s 00:10:26.657 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.657 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:10:26.915 256+0 records in 00:10:26.915 256+0 records out 00:10:26.915 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.16722 s, 6.3 MB/s 00:10:26.915 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.915 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:10:26.915 256+0 records in 00:10:26.915 256+0 records out 00:10:26.915 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168524 s, 6.2 MB/s 00:10:26.915 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.915 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:10:27.173 256+0 records in 00:10:27.173 256+0 records out 00:10:27.173 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122374 s, 8.6 MB/s 00:10:27.173 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.173 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:10:27.173 256+0 records in 00:10:27.173 256+0 records out 00:10:27.173 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167834 s, 6.2 MB/s 00:10:27.173 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.173 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:10:27.430 256+0 records in 00:10:27.430 256+0 records out 00:10:27.430 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167079 s, 6.3 MB/s 00:10:27.430 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.430 07:15:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:10:27.688 256+0 records in 00:10:27.688 256+0 records out 00:10:27.688 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.164405 s, 6.4 MB/s 00:10:27.688 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.688 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:10:27.688 256+0 records in 00:10:27.688 256+0 records out 00:10:27.688 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.161683 s, 6.5 
MB/s 00:10:27.688 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.688 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:10:27.945 256+0 records in 00:10:27.945 256+0 records out 00:10:27.945 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.123859 s, 8.5 MB/s 00:10:27.945 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.945 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:10:27.945 256+0 records in 00:10:27.945 256+0 records out 00:10:27.945 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.10327 s, 10.2 MB/s 00:10:27.945 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.945 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:10:28.203 256+0 records in 00:10:28.203 256+0 records out 00:10:28.203 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0986469 s, 10.6 MB/s 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:10:28.203 
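The write pass above seeded a 1 MiB file from /dev/urandom and pushed the same 256 x 4 KiB blocks to every export with O_DIRECT; the verify pass now running reads each device back and compares it byte-for-byte against that file. A condensed sketch of nbd_dd_data_verify as traced, again with $SPDK_DIR standing in for the workspace path:

    nbd_dd_data_verify() {
        local nbd_list=($1)                 # space-separated device list, as passed in the trace
        local operation=$2
        local tmp_file=$SPDK_DIR/test/bdev/nbdrandtest

        if [ "$operation" = write ]; then
            dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
            for dev in "${nbd_list[@]}"; do
                dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
            done
        elif [ "$operation" = verify ]; then
            for dev in "${nbd_list[@]}"; do
                cmp -b -n 1M "$tmp_file" "$dev"   # any mismatch makes cmp exit non-zero and fails the test
            done
            rm "$tmp_file"
        fi
    }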
07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:28.203 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:28.461 07:16:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:28.719 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( 
i = 1 )) 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:28.980 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:29.238 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:29.496 07:16:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # 
return 0 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:29.753 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.011 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.012 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.269 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.527 07:16:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd3 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:10:30.800 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:31.067 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 
)) 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:31.331 07:16:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:31.589 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:31.847 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count 
/var/tmp/spdk-nbd.sock 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:32.105 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:10:32.363 07:16:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:10:32.621 malloc_lvol_verify 00:10:32.621 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:10:32.878 9a7ed5f9-0684-4e29-876d-34fd883470b7 00:10:32.878 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:10:33.137 88bc3f5c-78a1-4fcf-84d5-c2035378c48a 00:10:33.137 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:10:33.395 /dev/nbd0 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:10:33.395 mke2fs 1.46.5 (30-Dec-2021) 00:10:33.395 Discarding device blocks: 0/4096 done 00:10:33.395 Creating filesystem with 4096 1k blocks and 1024 inodes 
00:10:33.395 00:10:33.395 Allocating group tables: 0/1 done 00:10:33.395 Writing inode tables: 0/1 done 00:10:33.395 Creating journal (1024 blocks): done 00:10:33.395 Writing superblocks and filesystem accounting information: 0/1 done 00:10:33.395 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:33.395 07:16:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1562331 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1562331 ']' 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1562331 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1562331 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1562331' 00:10:33.654 killing process with pid 1562331 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1562331 00:10:33.654 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1562331 00:10:33.912 07:16:06 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:10:33.912 00:10:33.912 real 0m22.641s 00:10:33.912 user 0m27.844s 00:10:33.912 sys 0m13.064s 00:10:33.912 07:16:06 
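Two things happened in quick succession above. First, every one of the 16 exports was torn down: nbd_stop_disk is issued over the RPC socket and waitfornbd_exit polls /proc/partitions until the kernel node disappears. Second, a final lvol sanity pass created a 16 MB malloc bdev (512-byte blocks), built an lvstore and a 4 MB lvol on it, exported the lvol as /dev/nbd0 and formatted it with ext4 before tearing that down too and killing pid 1562331. A condensed sketch, with the workspace prefix shortened to $SPDK_DIR and the poll back-off assumed (only the grep and the break are visible in the trace):

    rpc() { "$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock "$@"; }

    waitfornbd_exit() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions || break   # node is gone: stop waiting
            sleep 0.1                                          # assumed back-off between polls
        done
    }

    stop_one() {                       # repeated above for nbd0, nbd1, nbd10, ... nbd9
        rpc nbd_stop_disk "$1"
        waitfornbd_exit "$(basename "$1")"
    }

    # lvol sanity pass: malloc bdev -> lvstore -> 4 MB lvol -> NBD export -> ext4
    rpc bdev_malloc_create -b malloc_lvol_verify 16 512
    rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs
    rpc bdev_lvol_create lvol 4 -l lvs
    rpc nbd_start_disk lvs/lvol /dev/nbd0
    mkfs.ext4 /dev/nbd0
    stop_one /dev/nbd0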
blockdev_general.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:33.912 07:16:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:33.912 ************************************ 00:10:33.912 END TEST bdev_nbd 00:10:33.912 ************************************ 00:10:33.912 07:16:06 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:10:33.912 07:16:06 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:10:33.912 07:16:06 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:10:33.912 07:16:06 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:10:33.912 07:16:06 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:33.912 07:16:06 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:33.912 07:16:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:34.170 ************************************ 00:10:34.170 START TEST bdev_fio 00:10:34.170 ************************************ 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:34.170 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:10:34.170 
07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:10:34.170 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p0]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:34.171 07:16:06 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:34.171 ************************************ 00:10:34.171 START TEST bdev_fio_rw_verify 00:10:34.171 ************************************ 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:34.171 07:16:06 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:34.749 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 
4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:34.749 fio-3.35 00:10:34.749 Starting 16 threads 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:34.749 
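The fio job now starting was assembled a few lines earlier: once the workload is confirmed to be verify and /usr/src/fio/fio reports a 3.x build, the script adds serialize_overlap=1 and one [job_<bdev>] section per bdev name, then launches a stock fio binary with the SPDK bdev plugin preloaded so --ioengine=spdk_bdev resolves. xtrace does not echo redirections, so the appends to bdev.fio below are inferred; $SPDK_DIR again stands in for the workspace path, and the sanitizer handling is dropped because ldd found no libasan in this build. A condensed sketch:

    config=$SPDK_DIR/test/bdev/bdev.fio
    echo serialize_overlap=1 >> "$config"              # AIO bdev type + fio 3.x, per the checks at @1323-1325

    for b in "${bdevs_name[@]}"; do                    # Malloc0 Malloc1p0 ... raid1 AIO0, as echoed above
        printf '[job_%s]\nfilename=%s\n' "$b" "$b" >> "$config"
    done

    LD_PRELOAD=" $SPDK_DIR/build/fio/spdk_bdev" /usr/src/fio/fio \
        --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
        "$config" --verify_state_save=0 \
        --spdk_json_conf="$SPDK_DIR/test/bdev/bdev.json" \
        --spdk_mem=0 --aux-path="$SPDK_DIR/../output"

The qat_pci_device_allocate()/EAL lines around this point only record that the QAT driver hit its device limit and left the remaining virtual functions unused; the 16-thread job runs to completion regardless, as the summary below shows.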
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:34.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:34.749 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:46.952 00:10:46.952 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1567008: Thu Jul 25 07:16:17 2024 00:10:46.952 read: IOPS=105k, BW=411MiB/s (431MB/s)(4108MiB/10001msec) 00:10:46.952 slat (usec): min=2, max=1314, avg=29.57, stdev=13.77 00:10:46.952 clat (usec): min=9, max=4253, avg=242.74, stdev=122.49 00:10:46.952 lat (usec): min=16, max=4289, avg=272.31, stdev=129.80 00:10:46.952 clat percentiles (usec): 00:10:46.952 | 50.000th=[ 233], 99.000th=[ 619], 99.900th=[ 725], 99.990th=[ 807], 00:10:46.952 | 99.999th=[ 1663] 00:10:46.952 write: IOPS=165k, BW=646MiB/s (677MB/s)(6379MiB/9878msec); 0 
zone resets 00:10:46.952 slat (usec): min=4, max=622, avg=42.25, stdev=13.51 00:10:46.952 clat (usec): min=11, max=1685, avg=290.86, stdev=138.32 00:10:46.952 lat (usec): min=31, max=1701, avg=333.11, stdev=144.90 00:10:46.952 clat percentiles (usec): 00:10:46.952 | 50.000th=[ 277], 99.000th=[ 701], 99.900th=[ 791], 99.990th=[ 840], 00:10:46.952 | 99.999th=[ 1057] 00:10:46.952 bw ( KiB/s): min=553808, max=806265, per=99.50%, avg=657959.00, stdev=4601.77, samples=304 00:10:46.952 iops : min=138452, max=201566, avg=164489.68, stdev=1150.44, samples=304 00:10:46.952 lat (usec) : 10=0.01%, 20=0.01%, 50=1.08%, 100=6.44%, 250=40.49% 00:10:46.952 lat (usec) : 500=46.66%, 750=5.07%, 1000=0.23% 00:10:46.952 lat (msec) : 2=0.01%, 10=0.01% 00:10:46.952 cpu : usr=99.20%, sys=0.41%, ctx=696, majf=0, minf=2690 00:10:46.952 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:46.952 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:46.952 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:46.952 issued rwts: total=1051706,1633061,0,0 short=0,0,0,0 dropped=0,0,0,0 00:10:46.952 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:46.952 00:10:46.952 Run status group 0 (all jobs): 00:10:46.952 READ: bw=411MiB/s (431MB/s), 411MiB/s-411MiB/s (431MB/s-431MB/s), io=4108MiB (4308MB), run=10001-10001msec 00:10:46.952 WRITE: bw=646MiB/s (677MB/s), 646MiB/s-646MiB/s (677MB/s-677MB/s), io=6379MiB (6689MB), run=9878-9878msec 00:10:46.952 00:10:46.952 real 0m11.538s 00:10:46.952 user 2m52.187s 00:10:46.952 sys 0m1.559s 00:10:46.952 07:16:18 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:46.952 07:16:18 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:10:46.952 ************************************ 00:10:46.952 END TEST bdev_fio_rw_verify 00:10:46.952 ************************************ 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:46.952 07:16:18 
blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:10:46.952 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:46.954 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ed3ac8fc-f7f7-48ca-ac63-999d8023b77c"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ed3ac8fc-f7f7-48ca-ac63-999d8023b77c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "1a93c10b-5432-5a78-957f-77010344738e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "1a93c10b-5432-5a78-957f-77010344738e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "b12a7de3-4bf0-5deb-ba42-dc966f2eedba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b12a7de3-4bf0-5deb-ba42-dc966f2eedba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "3562a809-2794-5941-bbaa-30d087ad9865"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3562a809-2794-5941-bbaa-30d087ad9865",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "37067070-100d-577c-ac50-7ca2f2be8d90"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "37067070-100d-577c-ac50-7ca2f2be8d90",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "5301bb85-d605-5559-802f-5723acc73aa4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5301bb85-d605-5559-802f-5723acc73aa4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "a091c216-71c7-5418-b3d3-21203eafaa11"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a091c216-71c7-5418-b3d3-21203eafaa11",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": 
false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "dc1cc375-f4d2-53a6-8c7a-6458066d8d83"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "dc1cc375-f4d2-53a6-8c7a-6458066d8d83",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "a9e542f7-0a6a-5502-a926-602c78668cb4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a9e542f7-0a6a-5502-a926-602c78668cb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c1a6413b-bc92-5f30-b69e-f46122fadc9d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c1a6413b-bc92-5f30-b69e-f46122fadc9d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "1078a708-cda8-5308-92d4-0dbb0b48248f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"1078a708-cda8-5308-92d4-0dbb0b48248f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "3b926705-49d8-5fb4-b003-368daab27267"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3b926705-49d8-5fb4-b003-368daab27267",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "945bcb1b-b0b0-440c-a11b-c27dff298311"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "945bcb1b-b0b0-440c-a11b-c27dff298311",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "945bcb1b-b0b0-440c-a11b-c27dff298311",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ad15b13a-fc18-4c61-b47d-3bc9c3285e6e",' ' "is_configured": true,' ' "data_offset": 
0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "8a91ebcd-167d-410e-a998-3777fef8bd66",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "386496ec-cb13-444c-a4af-a1669d4b7554"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "386496ec-cb13-444c-a4af-a1669d4b7554",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "386496ec-cb13-444c-a4af-a1669d4b7554",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "6202f513-d7cf-406a-8219-930d1538ce50",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "d415d0b3-7c73-478f-8c4c-2edf39e6de3a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": 
"aa47f465-f753-4eb2-9aba-656e55c06000",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "f70f4e42-ee8b-411f-80e8-bc604309b45f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "239b5705-83ee-4f12-9173-ff9d84d7aea7"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "239b5705-83ee-4f12-9173-ff9d84d7aea7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:46.954 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:10:46.954 Malloc1p0 00:10:46.954 Malloc1p1 00:10:46.954 Malloc2p0 00:10:46.954 Malloc2p1 00:10:46.954 Malloc2p2 00:10:46.954 Malloc2p3 00:10:46.954 Malloc2p4 00:10:46.954 Malloc2p5 00:10:46.954 Malloc2p6 00:10:46.954 Malloc2p7 00:10:46.954 TestPT 00:10:46.954 raid0 00:10:46.954 concat0 ]] 00:10:46.954 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ed3ac8fc-f7f7-48ca-ac63-999d8023b77c"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ed3ac8fc-f7f7-48ca-ac63-999d8023b77c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "1a93c10b-5432-5a78-957f-77010344738e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "1a93c10b-5432-5a78-957f-77010344738e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' 
"reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "b12a7de3-4bf0-5deb-ba42-dc966f2eedba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b12a7de3-4bf0-5deb-ba42-dc966f2eedba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "3562a809-2794-5941-bbaa-30d087ad9865"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3562a809-2794-5941-bbaa-30d087ad9865",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "37067070-100d-577c-ac50-7ca2f2be8d90"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "37067070-100d-577c-ac50-7ca2f2be8d90",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "5301bb85-d605-5559-802f-5723acc73aa4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' 
"num_blocks": 8192,' ' "uuid": "5301bb85-d605-5559-802f-5723acc73aa4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "a091c216-71c7-5418-b3d3-21203eafaa11"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a091c216-71c7-5418-b3d3-21203eafaa11",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "dc1cc375-f4d2-53a6-8c7a-6458066d8d83"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "dc1cc375-f4d2-53a6-8c7a-6458066d8d83",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "a9e542f7-0a6a-5502-a926-602c78668cb4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a9e542f7-0a6a-5502-a926-602c78668cb4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c1a6413b-bc92-5f30-b69e-f46122fadc9d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c1a6413b-bc92-5f30-b69e-f46122fadc9d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "1078a708-cda8-5308-92d4-0dbb0b48248f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1078a708-cda8-5308-92d4-0dbb0b48248f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "3b926705-49d8-5fb4-b003-368daab27267"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3b926705-49d8-5fb4-b003-368daab27267",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "945bcb1b-b0b0-440c-a11b-c27dff298311"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "945bcb1b-b0b0-440c-a11b-c27dff298311",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "945bcb1b-b0b0-440c-a11b-c27dff298311",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "ad15b13a-fc18-4c61-b47d-3bc9c3285e6e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "8a91ebcd-167d-410e-a998-3777fef8bd66",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "386496ec-cb13-444c-a4af-a1669d4b7554"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "386496ec-cb13-444c-a4af-a1669d4b7554",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "386496ec-cb13-444c-a4af-a1669d4b7554",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "6202f513-d7cf-406a-8219-930d1538ce50",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "d415d0b3-7c73-478f-8c4c-2edf39e6de3a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' 
' "uuid": "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c08d92e4-fe9c-4fdf-8aa2-5009fab268b7",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "aa47f465-f753-4eb2-9aba-656e55c06000",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "f70f4e42-ee8b-411f-80e8-bc604309b45f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "239b5705-83ee-4f12-9173-ff9d84d7aea7"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "239b5705-83ee-4f12-9173-ff9d84d7aea7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- 
# for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.955 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:46.956 07:16:18 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:46.956 ************************************ 00:10:46.956 START TEST bdev_fio_trim 00:10:46.956 ************************************ 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:46.956 07:16:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:46.956 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, 
iodepth=8 00:10:46.956 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:46.956 fio-3.35 00:10:46.956 Starting 14 threads 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3f:01.5 cannot be 
used 00:10:46.956 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.956 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:46.957 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.957 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:59.224 00:10:59.224 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1569211: Thu Jul 25 07:16:29 2024 00:10:59.224 write: IOPS=145k, BW=566MiB/s (593MB/s)(5658MiB/10002msec); 0 zone resets 00:10:59.224 slat (usec): min=3, max=909, avg=34.39, stdev=11.89 00:10:59.224 clat (usec): min=32, max=3555, avg=239.56, stdev=95.61 00:10:59.224 lat (usec): min=48, max=3600, avg=273.95, stdev=101.89 00:10:59.224 clat percentiles (usec): 00:10:59.224 | 50.000th=[ 227], 99.000th=[ 529], 99.900th=[ 693], 99.990th=[ 766], 00:10:59.224 | 99.999th=[ 947] 00:10:59.224 bw ( KiB/s): min=473024, max=766827, per=100.00%, avg=583787.11, stdev=6816.30, samples=266 00:10:59.224 iops : min=118256, max=191704, avg=145946.63, stdev=1704.06, samples=266 00:10:59.224 trim: IOPS=145k, BW=566MiB/s (593MB/s)(5658MiB/10002msec); 0 zone resets 00:10:59.224 slat (usec): min=4, max=292, avg=23.80, stdev= 7.90 00:10:59.224 clat (usec): min=6, max=3600, avg=272.06, stdev=104.00 00:10:59.224 lat (usec): min=18, max=3622, avg=295.86, stdev=108.34 00:10:59.224 clat percentiles (usec): 00:10:59.224 | 50.000th=[ 260], 99.000th=[ 586], 99.900th=[ 766], 99.990th=[ 840], 00:10:59.224 | 99.999th=[ 971] 00:10:59.224 bw ( KiB/s): min=473024, max=766827, per=100.00%, avg=583787.11, stdev=6816.33, samples=266 00:10:59.224 iops : min=118256, max=191704, avg=145946.63, stdev=1704.07, samples=266 00:10:59.224 lat (usec) : 10=0.01%, 20=0.01%, 50=0.07%, 100=2.28%, 250=50.59% 00:10:59.224 lat (usec) : 500=45.08%, 750=1.89%, 1000=0.08% 00:10:59.224 lat (msec) : 2=0.01%, 4=0.01% 00:10:59.224 cpu : usr=99.61%, sys=0.01%, ctx=647, majf=0, minf=1028 00:10:59.224 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:59.224 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:59.224 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:59.224 issued rwts: total=0,1448534,1448536,0 short=0,0,0,0 dropped=0,0,0,0 00:10:59.224 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:59.224 00:10:59.224 Run status group 0 (all jobs): 00:10:59.224 WRITE: bw=566MiB/s (593MB/s), 
566MiB/s-566MiB/s (593MB/s-593MB/s), io=5658MiB (5933MB), run=10002-10002msec 00:10:59.224 TRIM: bw=566MiB/s (593MB/s), 566MiB/s-566MiB/s (593MB/s-593MB/s), io=5658MiB (5933MB), run=10002-10002msec 00:10:59.224 00:10:59.224 real 0m11.542s 00:10:59.224 user 2m34.132s 00:10:59.224 sys 0m0.642s 00:10:59.224 07:16:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:59.224 07:16:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:10:59.224 ************************************ 00:10:59.224 END TEST bdev_fio_trim 00:10:59.224 ************************************ 00:10:59.224 07:16:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:10:59.224 07:16:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:59.224 07:16:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:10:59.224 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:59.224 07:16:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:10:59.224 00:10:59.224 real 0m23.452s 00:10:59.224 user 5m26.524s 00:10:59.224 sys 0m2.402s 00:10:59.224 07:16:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:59.224 07:16:29 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:59.224 ************************************ 00:10:59.224 END TEST bdev_fio 00:10:59.224 ************************************ 00:10:59.224 07:16:29 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:59.224 07:16:29 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:59.224 07:16:29 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:10:59.224 07:16:29 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:59.224 07:16:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:59.224 ************************************ 00:10:59.224 START TEST bdev_verify 00:10:59.224 ************************************ 00:10:59.224 07:16:29 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:59.224 [2024-07-25 07:16:30.044046] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:10:59.224 [2024-07-25 07:16:30.044105] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571021 ] 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:59.224 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.224 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:59.224 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.225 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:59.225 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.225 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:59.225 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.225 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:59.225 [2024-07-25 07:16:30.175331] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:59.225 [2024-07-25 07:16:30.259095] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:10:59.225 [2024-07-25 07:16:30.259099] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:59.225 [2024-07-25 07:16:30.416326] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:59.225 [2024-07-25 07:16:30.416377] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:59.225 [2024-07-25 07:16:30.416390] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:59.225 [2024-07-25 07:16:30.424334] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:59.225 [2024-07-25 07:16:30.424358] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:59.225 [2024-07-25 07:16:30.432364] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:59.225 [2024-07-25 07:16:30.432387] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:59.225 [2024-07-25 07:16:30.503940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:59.225 [2024-07-25 07:16:30.503989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.225 [2024-07-25 07:16:30.504004] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x206d5c0 00:10:59.225 [2024-07-25 07:16:30.504016] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.225 [2024-07-25 07:16:30.505324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.225 [2024-07-25 07:16:30.505352] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:59.225 Running I/O for 5 seconds... 
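For context: the verify pass above is driven by SPDK's bdevperf example (spdk/build/examples/bdevperf) pointed at the JSON bdev config in spdk/test/bdev/bdev.json, with queue depth 128, 4 KiB I/Os, a 5 second verify workload and core mask 0x3, exactly as the run_test line shows. The sketch below is only a minimal, illustrative way to reproduce a similar verify run against a single Malloc bdev; the stand-in config file, bdev name and sizes are assumptions and not the contents of the bdev.json used by this job.

# Illustrative stand-in config: one 128 MiB Malloc bdev with 512 B blocks
# (assumed values -- the real bdev.json used above is not shown in this log)
cat > /tmp/bdevperf_malloc.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 262144, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF
# Same switches as the logged invocation: -q queue depth, -o I/O size in bytes,
# -w workload, -t run time in seconds, -m reactor core mask
# (-C is carried over unchanged from the logged command)
./build/examples/bdevperf --json /tmp/bdevperf_malloc.json \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3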
00:11:04.497 00:11:04.497 Latency(us) 00:11:04.497 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:04.497 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.497 Verification LBA range: start 0x0 length 0x1000 00:11:04.497 Malloc0 : 5.05 1342.19 5.24 0.00 0.00 95162.75 491.52 243269.63 00:11:04.497 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.497 Verification LBA range: start 0x1000 length 0x1000 00:11:04.497 Malloc0 : 5.13 1322.76 5.17 0.00 0.00 96564.92 494.80 377487.36 00:11:04.498 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x800 00:11:04.498 Malloc1p0 : 5.21 687.88 2.69 0.00 0.00 185056.14 3434.09 226492.42 00:11:04.498 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x800 length 0x800 00:11:04.498 Malloc1p0 : 5.17 693.41 2.71 0.00 0.00 183593.00 3434.09 214748.36 00:11:04.498 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x800 00:11:04.498 Malloc1p1 : 5.21 687.63 2.69 0.00 0.00 184597.60 3342.34 221459.25 00:11:04.498 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x800 length 0x800 00:11:04.498 Malloc1p1 : 5.17 693.13 2.71 0.00 0.00 183138.58 3342.34 209715.20 00:11:04.498 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p0 : 5.21 687.37 2.69 0.00 0.00 184124.72 3198.16 218103.81 00:11:04.498 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p0 : 5.17 692.84 2.71 0.00 0.00 182671.77 3171.94 204682.04 00:11:04.498 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p1 : 5.22 687.12 2.68 0.00 0.00 183684.63 3250.59 212231.78 00:11:04.498 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p1 : 5.17 692.56 2.71 0.00 0.00 182254.58 3237.48 199648.87 00:11:04.498 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p2 : 5.22 686.87 2.68 0.00 0.00 183240.64 3224.37 206359.76 00:11:04.498 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p2 : 5.18 692.28 2.70 0.00 0.00 181820.55 3289.91 194615.71 00:11:04.498 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p3 : 5.22 686.62 2.68 0.00 0.00 182773.73 3171.94 202165.45 00:11:04.498 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p3 : 5.18 692.00 2.70 0.00 0.00 181356.97 3185.05 188743.68 00:11:04.498 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p4 : 5.22 686.34 2.68 0.00 0.00 182313.78 3224.37 
196293.43 00:11:04.498 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p4 : 5.18 691.72 2.70 0.00 0.00 180901.43 3263.69 183710.52 00:11:04.498 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p5 : 5.22 686.07 2.68 0.00 0.00 181875.04 3250.59 192099.12 00:11:04.498 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p5 : 5.18 691.43 2.70 0.00 0.00 180455.16 3289.91 179516.21 00:11:04.498 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p6 : 5.23 685.79 2.68 0.00 0.00 181424.02 3250.59 189582.54 00:11:04.498 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p6 : 5.19 691.15 2.70 0.00 0.00 180002.71 3263.69 173644.19 00:11:04.498 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x200 00:11:04.498 Malloc2p7 : 5.23 685.51 2.68 0.00 0.00 181028.02 3342.34 189582.54 00:11:04.498 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x200 length 0x200 00:11:04.498 Malloc2p7 : 5.19 690.87 2.70 0.00 0.00 179600.23 3355.44 171127.60 00:11:04.498 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x1000 00:11:04.498 TestPT : 5.25 682.33 2.67 0.00 0.00 181216.49 20552.09 189582.54 00:11:04.498 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x1000 length 0x1000 00:11:04.498 TestPT : 5.23 684.82 2.68 0.00 0.00 180581.04 11429.48 251658.24 00:11:04.498 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x2000 00:11:04.498 raid0 : 5.23 684.82 2.68 0.00 0.00 179854.68 3185.05 163577.86 00:11:04.498 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x2000 length 0x2000 00:11:04.498 raid0 : 5.24 708.67 2.77 0.00 0.00 173843.27 3224.37 146800.64 00:11:04.498 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x2000 00:11:04.498 concat0 : 5.26 705.71 2.76 0.00 0.00 174108.29 3237.48 160222.41 00:11:04.498 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x2000 length 0x2000 00:11:04.498 concat0 : 5.24 708.41 2.77 0.00 0.00 173452.19 3263.69 142606.34 00:11:04.498 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 length 0x1000 00:11:04.498 raid1 : 5.26 705.47 2.76 0.00 0.00 173748.87 3696.23 156866.97 00:11:04.498 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x1000 length 0x1000 00:11:04.498 raid1 : 5.24 708.14 2.77 0.00 0.00 173073.90 3827.30 139250.89 00:11:04.498 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x0 
length 0x4e2 00:11:04.498 AIO0 : 5.26 705.29 2.76 0.00 0.00 173292.02 1474.56 149317.22 00:11:04.498 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:11:04.498 Verification LBA range: start 0x4e2 length 0x4e2 00:11:04.498 AIO0 : 5.24 707.95 2.77 0.00 0.00 172642.52 1481.11 143445.20 00:11:04.498 =================================================================================================================== 00:11:04.498 Total : 23455.18 91.62 0.00 0.00 170644.36 491.52 377487.36 00:11:04.498 00:11:04.498 real 0m6.414s 00:11:04.498 user 0m11.924s 00:11:04.498 sys 0m0.369s 00:11:04.498 07:16:36 blockdev_general.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:04.498 07:16:36 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:11:04.498 ************************************ 00:11:04.498 END TEST bdev_verify 00:11:04.498 ************************************ 00:11:04.498 07:16:36 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:11:04.498 07:16:36 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:11:04.498 07:16:36 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:04.498 07:16:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:04.498 ************************************ 00:11:04.498 START TEST bdev_verify_big_io 00:11:04.498 ************************************ 00:11:04.498 07:16:36 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:11:04.498 [2024-07-25 07:16:36.545649] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:11:04.498 [2024-07-25 07:16:36.545706] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1572210 ] 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.498 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:04.498 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:04.499 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:04.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.499 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:04.499 [2024-07-25 07:16:36.677860] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:04.499 [2024-07-25 07:16:36.761827] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:04.499 [2024-07-25 07:16:36.761833] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:11:04.499 [2024-07-25 07:16:36.901804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:04.499 [2024-07-25 07:16:36.901848] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:11:04.499 [2024-07-25 07:16:36.901861] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:11:04.499 [2024-07-25 07:16:36.909811] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:04.499 [2024-07-25 07:16:36.909835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:04.499 [2024-07-25 07:16:36.917829] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:04.499 [2024-07-25 07:16:36.917852] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:04.499 [2024-07-25 07:16:36.988728] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:04.499 [2024-07-25 07:16:36.988775] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:04.499 [2024-07-25 07:16:36.988791] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19075c0 00:11:04.499 [2024-07-25 07:16:36.988802] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:04.499 [2024-07-25 07:16:36.990084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:04.499 [2024-07-25 07:16:36.990113] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:11:04.759 [2024-07-25 07:16:37.181325] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). 
Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.182642] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.184557] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.185850] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.187824] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.189207] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.191198] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.193226] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.194445] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.195913] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.196887] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.198365] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.199344] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.200827] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.201803] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.203290] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:11:04.759 [2024-07-25 07:16:37.224045] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:11:04.759 [2024-07-25 07:16:37.225773] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:11:04.759 Running I/O for 5 seconds... 00:11:12.883 00:11:12.883 Latency(us) 00:11:12.883 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:12.883 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x100 00:11:12.883 Malloc0 : 5.74 178.29 11.14 0.00 0.00 703839.38 799.54 1852204.65 00:11:12.883 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x100 length 0x100 00:11:12.883 Malloc0 : 5.73 156.36 9.77 0.00 0.00 802899.56 829.03 2214592.51 00:11:12.883 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x80 00:11:12.883 Malloc1p0 : 6.40 50.66 3.17 0.00 0.00 2294569.07 2451.05 3892314.11 00:11:12.883 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x80 length 0x80 00:11:12.883 Malloc1p0 : 6.18 75.05 4.69 0.00 0.00 1560081.92 2870.48 2644089.24 00:11:12.883 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x80 00:11:12.883 Malloc1p1 : 6.66 36.02 2.25 0.00 0.00 3091885.29 1848.12 5153960.76 00:11:12.883 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x80 length 0x80 00:11:12.883 Malloc1p1 : 6.54 36.72 2.29 0.00 0.00 3073245.46 1821.90 5153960.76 00:11:12.883 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x20 00:11:12.883 Malloc2p0 : 6.19 25.87 1.62 0.00 0.00 1097584.51 573.44 2066953.01 00:11:12.883 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p0 : 6.18 25.87 1.62 0.00 0.00 1095872.34 593.10 1892469.96 00:11:12.883 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA 
range: start 0x0 length 0x20 00:11:12.883 Malloc2p1 : 6.19 25.85 1.62 0.00 0.00 1088244.81 560.33 2026687.69 00:11:12.883 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p1 : 6.19 25.87 1.62 0.00 0.00 1086863.58 589.82 1865626.42 00:11:12.883 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x20 00:11:12.883 Malloc2p2 : 6.19 25.84 1.61 0.00 0.00 1078865.44 563.61 1999844.15 00:11:12.883 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p2 : 6.19 25.85 1.62 0.00 0.00 1077349.43 583.27 1838782.87 00:11:12.883 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x20 00:11:12.883 Malloc2p3 : 6.20 25.82 1.61 0.00 0.00 1069914.12 576.72 1986422.37 00:11:12.883 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p3 : 6.19 25.84 1.61 0.00 0.00 1069018.51 586.55 1825361.10 00:11:12.883 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x20 00:11:12.883 Malloc2p4 : 6.20 25.81 1.61 0.00 0.00 1061147.97 566.89 1959578.83 00:11:12.883 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p4 : 6.20 25.82 1.61 0.00 0.00 1060273.96 599.65 1798517.56 00:11:12.883 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x20 00:11:12.883 Malloc2p5 : 6.20 25.79 1.61 0.00 0.00 1051527.14 566.89 1932735.28 00:11:12.883 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p5 : 6.20 25.81 1.61 0.00 0.00 1050921.24 586.55 1771674.01 00:11:12.883 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x20 00:11:12.883 Malloc2p6 : 6.21 25.79 1.61 0.00 0.00 1042171.67 586.55 1905891.74 00:11:12.883 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p6 : 6.30 27.94 1.75 0.00 0.00 970891.74 599.65 1758252.24 00:11:12.883 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x20 00:11:12.883 Malloc2p7 : 6.30 27.92 1.74 0.00 0.00 961920.68 576.72 1879048.19 00:11:12.883 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x20 length 0x20 00:11:12.883 Malloc2p7 : 6.30 27.93 1.75 0.00 0.00 962442.79 596.38 1731408.69 00:11:12.883 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x100 00:11:12.883 TestPT : 6.81 39.95 2.50 0.00 0.00 2563005.73 1336.93 4804994.66 00:11:12.883 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x100 length 0x100 00:11:12.883 TestPT : 6.49 37.29 2.33 0.00 0.00 2797049.62 96888.42 3516504.47 00:11:12.883 Job: raid0 
(Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x200 00:11:12.883 raid0 : 6.82 42.26 2.64 0.00 0.00 2367810.21 1441.79 4617089.84 00:11:12.883 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x200 length 0x200 00:11:12.883 raid0 : 6.44 44.74 2.80 0.00 0.00 2275468.48 1468.01 4617089.84 00:11:12.883 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x0 length 0x200 00:11:12.883 concat0 : 6.81 46.98 2.94 0.00 0.00 2078842.46 1441.79 4456028.57 00:11:12.883 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:11:12.883 Verification LBA range: start 0x200 length 0x200 00:11:12.884 concat0 : 6.54 48.93 3.06 0.00 0.00 2017749.29 1461.45 4456028.57 00:11:12.884 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:11:12.884 Verification LBA range: start 0x0 length 0x100 00:11:12.884 raid1 : 6.81 62.24 3.89 0.00 0.00 1551175.80 1821.90 4294967.30 00:11:12.884 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:11:12.884 Verification LBA range: start 0x100 length 0x100 00:11:12.884 raid1 : 6.79 63.63 3.98 0.00 0.00 1515964.78 1900.54 4268123.75 00:11:12.884 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:11:12.884 Verification LBA range: start 0x0 length 0x4e 00:11:12.884 AIO0 : 6.82 69.83 4.36 0.00 0.00 819145.08 734.00 2778306.97 00:11:12.884 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:11:12.884 Verification LBA range: start 0x4e length 0x4e 00:11:12.884 AIO0 : 6.79 60.66 3.79 0.00 0.00 945833.72 747.11 2711198.11 00:11:12.884 =================================================================================================================== 00:11:12.884 Total : 1469.20 91.82 0.00 0.00 1428241.00 560.33 5153960.76 00:11:12.884 00:11:12.884 real 0m8.032s 00:11:12.884 user 0m15.121s 00:11:12.884 sys 0m0.393s 00:11:12.884 07:16:44 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:12.884 07:16:44 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:11:12.884 ************************************ 00:11:12.884 END TEST bdev_verify_big_io 00:11:12.884 ************************************ 00:11:12.884 07:16:44 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:11:12.884 07:16:44 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:11:12.884 07:16:44 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:12.884 07:16:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:12.884 ************************************ 00:11:12.884 START TEST bdev_write_zeroes 00:11:12.884 ************************************ 00:11:12.884 07:16:44 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:11:12.884 [2024-07-25 07:16:44.662691] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:11:12.884 [2024-07-25 07:16:44.662745] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573596 ] 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:12.884 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:12.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:12.884 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:12.884 [2024-07-25 07:16:44.795623] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:12.884 [2024-07-25 07:16:44.878861] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:11:12.884 [2024-07-25 07:16:45.027540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:12.884 [2024-07-25 07:16:45.027594] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:11:12.884 [2024-07-25 07:16:45.027607] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:11:12.884 [2024-07-25 07:16:45.035549] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:12.884 [2024-07-25 07:16:45.035574] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:12.884 [2024-07-25 07:16:45.043558] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:12.884 [2024-07-25 07:16:45.043580] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:12.884 [2024-07-25 07:16:45.114533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:12.884 [2024-07-25 07:16:45.114580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:12.884 [2024-07-25 07:16:45.114595] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29d6430 00:11:12.884 [2024-07-25 07:16:45.114606] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.884 [2024-07-25 07:16:45.115872] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.884 [2024-07-25 07:16:45.115898] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:11:12.884 Running I/O for 1 seconds... 
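The write_zeroes pass that starts here reuses the same bdevperf binary with -w write_zeroes, -t 1 and core mask 0x1 against the same bdev.json. The passthru chain it reports (Malloc3 claimed by a pt_bdev named TestPT) can also be set up by hand against a running SPDK application via rpc.py; the sketch below is a hedged illustration of that idea only, and the sizes plus exact flag spellings are assumptions rather than a quote from this test.

# Hedged sketch: building the Malloc3 -> TestPT passthru chain by hand.
# Requires an SPDK app (e.g. build/bin/spdk_tgt) already running so rpc.py has a target.
./scripts/rpc.py bdev_malloc_create -b Malloc3 256 512      # assumed size: 256 MiB base bdev, 512 B blocks
./scripts/rpc.py bdev_passthru_create -b Malloc3 -p TestPT  # passthru vbdev claiming Malloc3, named TestPT
./scripts/rpc.py bdev_get_bdevs -b TestPT                   # confirm the vbdev registered, as the log reports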
00:11:14.262 00:11:14.262 Latency(us) 00:11:14.262 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:14.262 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc0 : 1.05 5386.51 21.04 0.00 0.00 23752.88 602.93 39426.46 00:11:14.262 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc1p0 : 1.05 5379.41 21.01 0.00 0.00 23744.76 829.03 38797.31 00:11:14.262 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc1p1 : 1.05 5372.38 20.99 0.00 0.00 23729.79 829.03 37958.45 00:11:14.262 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p0 : 1.05 5365.36 20.96 0.00 0.00 23712.19 829.03 37119.59 00:11:14.262 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p1 : 1.05 5358.29 20.93 0.00 0.00 23695.80 845.41 36280.73 00:11:14.262 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p2 : 1.05 5351.32 20.90 0.00 0.00 23678.97 825.75 35441.87 00:11:14.262 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p3 : 1.05 5344.33 20.88 0.00 0.00 23663.38 825.75 34603.01 00:11:14.262 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p4 : 1.06 5337.32 20.85 0.00 0.00 23645.29 819.20 33764.15 00:11:14.262 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p5 : 1.06 5330.38 20.82 0.00 0.00 23627.30 832.31 32925.29 00:11:14.262 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p6 : 1.06 5323.45 20.79 0.00 0.00 23610.87 835.58 32086.43 00:11:14.262 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 Malloc2p7 : 1.06 5316.50 20.77 0.00 0.00 23593.61 829.03 31247.56 00:11:14.262 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 TestPT : 1.06 5309.62 20.74 0.00 0.00 23575.66 851.97 30408.70 00:11:14.262 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 raid0 : 1.06 5301.75 20.71 0.00 0.00 23551.61 1487.67 28940.70 00:11:14.262 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 concat0 : 1.06 5293.94 20.68 0.00 0.00 23508.15 1474.56 27472.69 00:11:14.262 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 raid1 : 1.07 5284.23 20.64 0.00 0.00 23455.87 2359.30 25060.97 00:11:14.262 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:11:14.262 AIO0 : 1.07 5278.39 20.62 0.00 0.00 23375.03 943.72 24222.11 00:11:14.262 =================================================================================================================== 00:11:14.262 Total : 85333.18 333.33 0.00 0.00 23620.07 602.93 39426.46 00:11:14.262 00:11:14.262 real 0m2.098s 00:11:14.262 user 0m1.738s 00:11:14.262 sys 0m0.313s 00:11:14.262 07:16:46 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:14.262 07:16:46 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:11:14.262 ************************************ 00:11:14.262 END TEST bdev_write_zeroes 00:11:14.262 ************************************ 00:11:14.262 07:16:46 blockdev_general 
-- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:11:14.262 07:16:46 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:11:14.262 07:16:46 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:14.262 07:16:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:14.262 ************************************ 00:11:14.262 START TEST bdev_json_nonenclosed 00:11:14.262 ************************************ 00:11:14.262 07:16:46 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:11:14.522 [2024-07-25 07:16:46.843974] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:11:14.522 [2024-07-25 07:16:46.844027] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573886 ] 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:11:14.522 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:14.522 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.522 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:14.522 [2024-07-25 07:16:46.974841] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:14.782 [2024-07-25 07:16:47.058301] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:11:14.782 [2024-07-25 07:16:47.058360] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:11:14.782 [2024-07-25 07:16:47.058375] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:11:14.782 [2024-07-25 07:16:47.058386] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:14.782 00:11:14.782 real 0m0.357s 00:11:14.782 user 0m0.203s 00:11:14.782 sys 0m0.151s 00:11:14.782 07:16:47 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:14.782 07:16:47 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:11:14.782 ************************************ 00:11:14.782 END TEST bdev_json_nonenclosed 00:11:14.782 ************************************ 00:11:14.782 07:16:47 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:11:14.782 07:16:47 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:11:14.782 07:16:47 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:14.782 07:16:47 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:14.782 ************************************ 00:11:14.782 START TEST bdev_json_nonarray 00:11:14.782 ************************************ 00:11:14.782 07:16:47 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:11:14.782 [2024-07-25 07:16:47.286878] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:11:14.782 [2024-07-25 07:16:47.286931] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1574006 ] 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:15.042 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:15.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.042 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:15.042 [2024-07-25 07:16:47.419377] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:15.042 [2024-07-25 07:16:47.502253] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:11:15.042 [2024-07-25 07:16:47.502325] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:11:15.042 [2024-07-25 07:16:47.502341] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:11:15.042 [2024-07-25 07:16:47.502352] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:15.302 00:11:15.302 real 0m0.360s 00:11:15.302 user 0m0.200s 00:11:15.302 sys 0m0.157s 00:11:15.302 07:16:47 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:15.302 07:16:47 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:11:15.302 ************************************ 00:11:15.302 END TEST bdev_json_nonarray 00:11:15.302 ************************************ 00:11:15.302 07:16:47 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:11:15.302 07:16:47 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:11:15.302 07:16:47 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:15.302 07:16:47 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:15.302 07:16:47 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:15.302 ************************************ 00:11:15.302 START TEST bdev_qos 00:11:15.302 ************************************ 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- common/autotest_common.sh@1125 -- # qos_test_suite '' 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=1574178 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 1574178' 00:11:15.302 Process qos testing pid: 1574178 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 1574178 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- 
common/autotest_common.sh@831 -- # '[' -z 1574178 ']' 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:15.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:15.302 07:16:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:11:15.302 [2024-07-25 07:16:47.721529] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:11:15.302 [2024-07-25 07:16:47.721587] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1574178 ] 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:11:15.302 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:15.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:15.302 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:15.561 [2024-07-25 07:16:47.842151] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:15.561 [2024-07-25 07:16:47.926642] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@864 -- # return 0 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.497 Malloc_0 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_0 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- 
common/autotest_common.sh@901 -- # local i 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.497 [ 00:11:16.497 { 00:11:16.497 "name": "Malloc_0", 00:11:16.497 "aliases": [ 00:11:16.497 "05c217e8-80cb-43fd-8c43-8778d93ac50d" 00:11:16.497 ], 00:11:16.497 "product_name": "Malloc disk", 00:11:16.497 "block_size": 512, 00:11:16.497 "num_blocks": 262144, 00:11:16.497 "uuid": "05c217e8-80cb-43fd-8c43-8778d93ac50d", 00:11:16.497 "assigned_rate_limits": { 00:11:16.497 "rw_ios_per_sec": 0, 00:11:16.497 "rw_mbytes_per_sec": 0, 00:11:16.497 "r_mbytes_per_sec": 0, 00:11:16.497 "w_mbytes_per_sec": 0 00:11:16.497 }, 00:11:16.497 "claimed": false, 00:11:16.497 "zoned": false, 00:11:16.497 "supported_io_types": { 00:11:16.497 "read": true, 00:11:16.497 "write": true, 00:11:16.497 "unmap": true, 00:11:16.497 "flush": true, 00:11:16.497 "reset": true, 00:11:16.497 "nvme_admin": false, 00:11:16.497 "nvme_io": false, 00:11:16.497 "nvme_io_md": false, 00:11:16.497 "write_zeroes": true, 00:11:16.497 "zcopy": true, 00:11:16.497 "get_zone_info": false, 00:11:16.497 "zone_management": false, 00:11:16.497 "zone_append": false, 00:11:16.497 "compare": false, 00:11:16.497 "compare_and_write": false, 00:11:16.497 "abort": true, 00:11:16.497 "seek_hole": false, 00:11:16.497 "seek_data": false, 00:11:16.497 "copy": true, 00:11:16.497 "nvme_iov_md": false 00:11:16.497 }, 00:11:16.497 "memory_domains": [ 00:11:16.497 { 00:11:16.497 "dma_device_id": "system", 00:11:16.497 "dma_device_type": 1 00:11:16.497 }, 00:11:16.497 { 00:11:16.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.497 "dma_device_type": 2 00:11:16.497 } 00:11:16.497 ], 00:11:16.497 "driver_specific": {} 00:11:16.497 } 00:11:16.497 ] 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.497 Null_1 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Null_1 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:16.497 07:16:48 
blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:16.497 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.498 [ 00:11:16.498 { 00:11:16.498 "name": "Null_1", 00:11:16.498 "aliases": [ 00:11:16.498 "2ee52221-aead-4dea-8aa1-c8b1dff74b63" 00:11:16.498 ], 00:11:16.498 "product_name": "Null disk", 00:11:16.498 "block_size": 512, 00:11:16.498 "num_blocks": 262144, 00:11:16.498 "uuid": "2ee52221-aead-4dea-8aa1-c8b1dff74b63", 00:11:16.498 "assigned_rate_limits": { 00:11:16.498 "rw_ios_per_sec": 0, 00:11:16.498 "rw_mbytes_per_sec": 0, 00:11:16.498 "r_mbytes_per_sec": 0, 00:11:16.498 "w_mbytes_per_sec": 0 00:11:16.498 }, 00:11:16.498 "claimed": false, 00:11:16.498 "zoned": false, 00:11:16.498 "supported_io_types": { 00:11:16.498 "read": true, 00:11:16.498 "write": true, 00:11:16.498 "unmap": false, 00:11:16.498 "flush": false, 00:11:16.498 "reset": true, 00:11:16.498 "nvme_admin": false, 00:11:16.498 "nvme_io": false, 00:11:16.498 "nvme_io_md": false, 00:11:16.498 "write_zeroes": true, 00:11:16.498 "zcopy": false, 00:11:16.498 "get_zone_info": false, 00:11:16.498 "zone_management": false, 00:11:16.498 "zone_append": false, 00:11:16.498 "compare": false, 00:11:16.498 "compare_and_write": false, 00:11:16.498 "abort": true, 00:11:16.498 "seek_hole": false, 00:11:16.498 "seek_data": false, 00:11:16.498 "copy": false, 00:11:16.498 "nvme_iov_md": false 00:11:16.498 }, 00:11:16.498 "driver_specific": {} 00:11:16.498 } 00:11:16.498 ] 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # qos_function_test 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:11:16.498 07:16:48 
blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:11:16.498 07:16:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:11:16.757 Running I/O for 60 seconds... 00:11:22.031 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 68276.21 273104.84 0.00 0.00 276480.00 0.00 0.00 ' 00:11:22.031 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=68276.21 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 68276 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=68276 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=17000 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 17000 -gt 1000 ']' 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 17000 Malloc_0 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 17000 IOPS Malloc_0 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:22.032 07:16:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:22.032 ************************************ 00:11:22.032 START TEST bdev_qos_iops 00:11:22.032 ************************************ 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # run_qos_test 17000 IOPS Malloc_0 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=17000 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:11:22.032 07:16:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 
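For context on the numbers above: the suite first reads the unthrottled rate for Malloc_0 from iostat.py (roughly 68k IOPS), derives a 17000 IOPS cap from it, applies the cap over RPC, and then re-measures; the check that follows only passes if the throttled rate lands within +/-10% of the cap (the 15300/18700 bounds printed below). Roughly the same steps by hand, assuming rpc_cmd in the harness is a thin wrapper around scripts/rpc.py and that bdevperf is already serving RPC on the default socket:

    # cap Malloc_0 at 17000 read/write IOPS, then re-measure for 5 seconds
    ./scripts/rpc.py bdev_set_qos_limit --rw_ios_per_sec 17000 Malloc_0
    measured=$(./scripts/iostat.py -d -i 1 -t 5 | grep Malloc_0 | tail -1 | awk '{print $2}')
    measured=${measured%.*}            # 16994.05 -> 16994, as echoed in the log
    lower=$((17000 * 9 / 10))          # 15300
    upper=$((17000 * 11 / 10))         # 18700
    [ "$measured" -ge "$lower" ] && [ "$measured" -le "$upper" ] && echo "within +/-10% of the cap"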
00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 16994.05 67976.20 0.00 0.00 69020.00 0.00 0.00 ' 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=16994.05 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 16994 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=16994 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=15300 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=18700 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 16994 -lt 15300 ']' 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 16994 -gt 18700 ']' 00:11:27.379 00:11:27.379 real 0m5.228s 00:11:27.379 user 0m0.108s 00:11:27.379 sys 0m0.044s 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:27.379 07:16:59 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:11:27.379 ************************************ 00:11:27.379 END TEST bdev_qos_iops 00:11:27.379 ************************************ 00:11:27.379 07:16:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:11:27.379 07:16:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:11:27.379 07:16:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:11:27.379 07:16:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:11:27.379 07:16:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:27.379 07:16:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:11:27.379 07:16:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:11:32.651 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 21419.66 85678.62 0.00 0.00 87040.00 0.00 0.00 ' 00:11:32.651 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:11:32.651 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:32.651 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:11:32.651 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=87040.00 00:11:32.651 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 87040 00:11:32.651 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=87040 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=8 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 8 -lt 2 ']' 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:32.652 07:17:04 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:32.652 ************************************ 00:11:32.652 START TEST bdev_qos_bw 00:11:32.652 ************************************ 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # run_qos_test 8 BANDWIDTH Null_1 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=8 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:11:32.652 07:17:04 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 2047.87 8191.47 0.00 0.00 8308.00 0.00 0.00 ' 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=8308.00 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 8308 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=8308 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=8192 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=7372 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=9011 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 8308 -lt 7372 ']' 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 8308 -gt 9011 ']' 00:11:37.924 00:11:37.924 real 0m5.249s 00:11:37.924 user 0m0.102s 00:11:37.924 sys 0m0.051s 00:11:37.924 07:17:10 
blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:11:37.924 ************************************ 00:11:37.924 END TEST bdev_qos_bw 00:11:37.924 ************************************ 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:37.924 07:17:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:37.924 ************************************ 00:11:37.924 START TEST bdev_qos_ro_bw 00:11:37.924 ************************************ 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:11:37.924 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Malloc_0 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:11:37.925 07:17:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 511.87 2047.50 0.00 0.00 2060.00 0.00 0.00 ' 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2060.00 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2060 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2060 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- 
bdev/blockdev.sh@393 -- # qos_limit=2048 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # upper_limit=2252 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -lt 1843 ']' 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -gt 2252 ']' 00:11:43.195 00:11:43.195 real 0m5.182s 00:11:43.195 user 0m0.108s 00:11:43.195 sys 0m0.045s 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:43.195 07:17:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:11:43.196 ************************************ 00:11:43.196 END TEST bdev_qos_ro_bw 00:11:43.196 ************************************ 00:11:43.196 07:17:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:11:43.196 07:17:15 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:43.196 07:17:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:43.454 07:17:15 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:43.454 07:17:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:11:43.454 07:17:15 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:43.454 07:17:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:43.713 00:11:43.713 Latency(us) 00:11:43.713 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:43.713 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:43.713 Malloc_0 : 26.73 23198.95 90.62 0.00 0.00 10927.88 1874.33 503316.48 00:11:43.713 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:43.713 Null_1 : 26.87 22168.79 86.60 0.00 0.00 11522.21 720.90 148478.36 00:11:43.713 =================================================================================================================== 00:11:43.713 Total : 45367.74 177.22 0.00 0.00 11219.11 720.90 503316.48 00:11:43.713 0 00:11:43.713 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 1574178 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # '[' -z 1574178 ']' 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # kill -0 1574178 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # uname 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1574178 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1574178' 00:11:43.714 killing process with pid 1574178 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@969 -- # kill 1574178 00:11:43.714 
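The same +/-10% window drives the read-only bandwidth check that just finished: the 2 MiB/s limit set with --r_mbytes_per_sec is scaled by 1024 to 2048 before being compared with the iostat figure, and the measured 2060 has to fall inside the resulting bounds. The arithmetic, with every value taken from the log above:

    qos_limit_kb=$((2 * 1024))           # --r_mbytes_per_sec 2 is scaled to 2048
    lower=$((qos_limit_kb * 9 / 10))     # 1843
    upper=$((qos_limit_kb * 11 / 10))    # 2252
    echo "measured 2060 must lie in [$lower, $upper]"   # it does, so the check passes

With all three QoS checks green, the suite deletes Malloc_0 and Null_1 over RPC and kills the bdevperf process, which is the shutdown recorded next.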
Received shutdown signal, test time was about 26.941324 seconds 00:11:43.714 00:11:43.714 Latency(us) 00:11:43.714 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:43.714 =================================================================================================================== 00:11:43.714 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:43.714 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@974 -- # wait 1574178 00:11:43.973 07:17:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 00:11:43.973 00:11:43.973 real 0m28.618s 00:11:43.973 user 0m29.545s 00:11:43.973 sys 0m0.843s 00:11:43.973 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:43.973 07:17:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:43.973 ************************************ 00:11:43.973 END TEST bdev_qos 00:11:43.973 ************************************ 00:11:43.973 07:17:16 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:11:43.973 07:17:16 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:43.973 07:17:16 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:43.973 07:17:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:43.973 ************************************ 00:11:43.973 START TEST bdev_qd_sampling 00:11:43.973 ************************************ 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # qd_sampling_test_suite '' 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=1579033 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 1579033' 00:11:43.973 Process bdev QD sampling period testing pid: 1579033 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 1579033 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@831 -- # '[' -z 1579033 ']' 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:43.973 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:43.973 07:17:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:43.973 [2024-07-25 07:17:16.424173] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
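Like the QoS suite, the sampling suite starts bdevperf with -z so the app comes up idle and waits on its RPC socket; the test bdev is then created over RPC and the actual I/O run is kicked off by the bdevperf.py helper. A by-hand sketch of that setup, with paths relative to an SPDK checkout and the default RPC socket assumed:

    # start bdevperf on two cores, idle until RPC arrives (flags as recorded above)
    ./build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C &
    # ... wait for the RPC socket to appear (the harness does this with waitforlisten) ...
    ./scripts/rpc.py bdev_malloc_create -b Malloc_QD 128 512
    ./examples/bdev/bdevperf/bdevperf.py perform_tests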
00:11:43.973 [2024-07-25 07:17:16.424235] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1579033 ] 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:43.973 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.973 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:43.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.974 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:43.974 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.974 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:43.974 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.974 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:43.974 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.974 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:43.974 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.974 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:43.974 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:43.974 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:44.233 [2024-07-25 07:17:16.556573] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:44.233 [2024-07-25 07:17:16.642996] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:44.233 [2024-07-25 07:17:16.643002] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:11:44.800 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:44.801 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@864 -- # return 0 00:11:44.801 07:17:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:11:44.801 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:44.801 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:45.060 Malloc_QD 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_QD 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # local i 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:45.060 [ 00:11:45.060 { 00:11:45.060 "name": "Malloc_QD", 00:11:45.060 "aliases": [ 00:11:45.060 "be63314d-2ff4-4c38-a69d-c70c9522a1ac" 00:11:45.060 ], 00:11:45.060 "product_name": "Malloc disk", 00:11:45.060 "block_size": 512, 00:11:45.060 "num_blocks": 262144, 00:11:45.060 "uuid": "be63314d-2ff4-4c38-a69d-c70c9522a1ac", 00:11:45.060 "assigned_rate_limits": { 00:11:45.060 "rw_ios_per_sec": 0, 00:11:45.060 "rw_mbytes_per_sec": 0, 00:11:45.060 "r_mbytes_per_sec": 0, 00:11:45.060 "w_mbytes_per_sec": 0 00:11:45.060 }, 00:11:45.060 "claimed": false, 00:11:45.060 "zoned": false, 00:11:45.060 "supported_io_types": { 00:11:45.060 "read": true, 00:11:45.060 "write": true, 00:11:45.060 "unmap": true, 00:11:45.060 "flush": true, 00:11:45.060 "reset": true, 00:11:45.060 "nvme_admin": false, 00:11:45.060 "nvme_io": false, 00:11:45.060 "nvme_io_md": false, 00:11:45.060 "write_zeroes": true, 00:11:45.060 "zcopy": true, 00:11:45.060 "get_zone_info": false, 00:11:45.060 "zone_management": false, 00:11:45.060 "zone_append": false, 00:11:45.060 "compare": false, 00:11:45.060 "compare_and_write": false, 00:11:45.060 "abort": true, 00:11:45.060 "seek_hole": false, 00:11:45.060 "seek_data": false, 00:11:45.060 "copy": true, 00:11:45.060 "nvme_iov_md": false 00:11:45.060 }, 00:11:45.060 "memory_domains": [ 00:11:45.060 { 00:11:45.060 "dma_device_id": "system", 00:11:45.060 "dma_device_type": 1 00:11:45.060 }, 00:11:45.060 { 00:11:45.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.060 "dma_device_type": 2 00:11:45.060 } 00:11:45.060 ], 00:11:45.060 "driver_specific": {} 00:11:45.060 } 00:11:45.060 ] 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@907 -- # return 0 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:11:45.060 07:17:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:45.060 Running I/O for 5 seconds... 
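While that 5-second run is in flight, the suite turns on queue-depth sampling for Malloc_QD and immediately reads the setting back through the I/O statistics, which is what the next block of output shows. The equivalent raw RPC calls, again assuming scripts/rpc.py against the default socket:

    ./scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10
    ./scripts/rpc.py bdev_get_iostat -b Malloc_QD | jq -r '.bdevs[0].queue_depth_polling_period'
    # the check below fails unless this prints back the configured value, 10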
00:11:46.966 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:11:46.967 "tick_rate": 2500000000, 00:11:46.967 "ticks": 14211056822621128, 00:11:46.967 "bdevs": [ 00:11:46.967 { 00:11:46.967 "name": "Malloc_QD", 00:11:46.967 "bytes_read": 805351936, 00:11:46.967 "num_read_ops": 196612, 00:11:46.967 "bytes_written": 0, 00:11:46.967 "num_write_ops": 0, 00:11:46.967 "bytes_unmapped": 0, 00:11:46.967 "num_unmap_ops": 0, 00:11:46.967 "bytes_copied": 0, 00:11:46.967 "num_copy_ops": 0, 00:11:46.967 "read_latency_ticks": 2478043857226, 00:11:46.967 "max_read_latency_ticks": 15532406, 00:11:46.967 "min_read_latency_ticks": 257298, 00:11:46.967 "write_latency_ticks": 0, 00:11:46.967 "max_write_latency_ticks": 0, 00:11:46.967 "min_write_latency_ticks": 0, 00:11:46.967 "unmap_latency_ticks": 0, 00:11:46.967 "max_unmap_latency_ticks": 0, 00:11:46.967 "min_unmap_latency_ticks": 0, 00:11:46.967 "copy_latency_ticks": 0, 00:11:46.967 "max_copy_latency_ticks": 0, 00:11:46.967 "min_copy_latency_ticks": 0, 00:11:46.967 "io_error": {}, 00:11:46.967 "queue_depth_polling_period": 10, 00:11:46.967 "queue_depth": 512, 00:11:46.967 "io_time": 30, 00:11:46.967 "weighted_io_time": 15360 00:11:46.967 } 00:11:46.967 ] 00:11:46.967 }' 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:46.967 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:46.967 00:11:46.967 Latency(us) 00:11:46.967 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:46.967 Job: Malloc_QD (Core Mask 0x1, workload: randread, 
depth: 256, IO size: 4096) 00:11:46.967 Malloc_QD : 2.01 50478.83 197.18 0.00 0.00 5059.18 1330.38 5347.74 00:11:46.967 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:46.967 Malloc_QD : 2.01 50954.86 199.04 0.00 0.00 5012.37 871.63 6239.03 00:11:46.967 =================================================================================================================== 00:11:46.967 Total : 101433.68 396.23 0.00 0.00 5035.65 871.63 6239.03 00:11:47.226 0 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 1579033 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # '[' -z 1579033 ']' 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # kill -0 1579033 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # uname 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1579033 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1579033' 00:11:47.226 killing process with pid 1579033 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@969 -- # kill 1579033 00:11:47.226 Received shutdown signal, test time was about 2.099760 seconds 00:11:47.226 00:11:47.226 Latency(us) 00:11:47.226 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:47.226 =================================================================================================================== 00:11:47.226 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:47.226 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@974 -- # wait 1579033 00:11:47.486 07:17:19 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:11:47.486 00:11:47.486 real 0m3.403s 00:11:47.486 user 0m6.680s 00:11:47.486 sys 0m0.408s 00:11:47.486 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:47.486 07:17:19 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:47.486 ************************************ 00:11:47.486 END TEST bdev_qd_sampling 00:11:47.486 ************************************ 00:11:47.486 07:17:19 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:11:47.486 07:17:19 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:47.486 07:17:19 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:47.486 07:17:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:47.486 ************************************ 00:11:47.486 START TEST bdev_error 00:11:47.486 ************************************ 00:11:47.487 07:17:19 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # error_test_suite '' 00:11:47.487 07:17:19 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 
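The error suite that starts here follows the usual SPDK pattern: an error-injection bdev (EE_Dev_1) is stacked on top of a plain malloc bdev (Dev_1), and failures are then forced over RPC so the layers above it can be exercised. The exact RPC calls for this run only appear later in the log, so the lines below are a generic sketch using the standard bdev_error_create / bdev_error_inject_error RPCs; the -n 5 error count is purely illustrative:

    ./scripts/rpc.py bdev_malloc_create -b Dev_1 128 512
    ./scripts/rpc.py bdev_error_create Dev_1                            # exposes EE_Dev_1 on top of Dev_1
    ./scripts/rpc.py bdev_error_inject_error EE_Dev_1 all failure -n 5  # fail the next 5 I/Os of any type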
00:11:47.487 07:17:19 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:11:47.487 07:17:19 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:11:47.487 07:17:19 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=1579605 00:11:47.487 07:17:19 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 1579605' 00:11:47.487 Process error testing pid: 1579605 00:11:47.487 07:17:19 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:11:47.487 07:17:19 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 1579605 00:11:47.487 07:17:19 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 1579605 ']' 00:11:47.487 07:17:19 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:47.487 07:17:19 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:47.487 07:17:19 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:47.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:47.487 07:17:19 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:47.487 07:17:19 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:47.487 [2024-07-25 07:17:19.910176] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:11:47.487 [2024-07-25 07:17:19.910230] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1579605 ] 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:47.487 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:47.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:47.487 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:47.747 [2024-07-25 07:17:20.031472] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:47.747 [2024-07-25 07:17:20.128083] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:48.315 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:48.315 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:11:48.315 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:48.315 07:17:20 blockdev_general.bdev_error -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.315 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 Dev_1 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 [ 00:11:48.575 { 00:11:48.575 "name": "Dev_1", 00:11:48.575 "aliases": [ 00:11:48.575 "2cba98d6-fc58-4c3d-b576-a2759df43c61" 00:11:48.575 ], 00:11:48.575 "product_name": "Malloc disk", 00:11:48.575 "block_size": 512, 00:11:48.575 "num_blocks": 262144, 00:11:48.575 "uuid": "2cba98d6-fc58-4c3d-b576-a2759df43c61", 00:11:48.575 "assigned_rate_limits": { 00:11:48.575 "rw_ios_per_sec": 0, 00:11:48.575 "rw_mbytes_per_sec": 0, 00:11:48.575 "r_mbytes_per_sec": 0, 00:11:48.575 "w_mbytes_per_sec": 0 00:11:48.575 }, 00:11:48.575 "claimed": false, 00:11:48.575 "zoned": false, 00:11:48.575 "supported_io_types": { 00:11:48.575 "read": true, 00:11:48.575 "write": true, 00:11:48.575 "unmap": true, 00:11:48.575 "flush": true, 00:11:48.575 "reset": true, 00:11:48.575 "nvme_admin": false, 00:11:48.575 "nvme_io": false, 00:11:48.575 "nvme_io_md": false, 00:11:48.575 "write_zeroes": true, 00:11:48.575 "zcopy": true, 00:11:48.575 "get_zone_info": false, 00:11:48.575 "zone_management": false, 00:11:48.575 "zone_append": false, 00:11:48.575 "compare": false, 00:11:48.575 "compare_and_write": false, 00:11:48.575 "abort": true, 00:11:48.575 "seek_hole": false, 00:11:48.575 "seek_data": false, 00:11:48.575 "copy": true, 00:11:48.575 "nvme_iov_md": false 00:11:48.575 }, 00:11:48.575 "memory_domains": [ 00:11:48.575 { 00:11:48.575 "dma_device_id": "system", 00:11:48.575 "dma_device_type": 1 00:11:48.575 }, 00:11:48.575 { 00:11:48.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.575 "dma_device_type": 2 00:11:48.575 } 00:11:48.575 ], 00:11:48.575 "driver_specific": {} 00:11:48.575 } 00:11:48.575 ] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:11:48.575 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 
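The error-injection sequence exercised here is driven entirely through SPDK's JSON-RPC interface: a malloc bdev is created, wrapped by an error bdev (bdev_error_create Dev_1 exposes it as EE_Dev_1), failures are injected, and bdevperf then replays I/O against it. A minimal sketch of that flow, assuming an SPDK application already listening on the default /var/tmp/spdk.sock and the stock scripts/rpc.py client (the rpc= path is illustrative, not taken from this run):

    rpc=./scripts/rpc.py                                   # assumed location of the SPDK RPC client
    $rpc bdev_malloc_create -b Dev_1 128 512               # 128 MiB backing bdev, 512 B blocks
    $rpc bdev_error_create Dev_1                           # wraps Dev_1 as the error bdev EE_Dev_1
    $rpc bdev_malloc_create -b Dev_2 128 512               # second, clean bdev for comparison
    $rpc bdev_error_inject_error EE_Dev_1 all failure -n 5 # fail the next 5 I/Os of any type
    # ... run the workload (the test uses bdevperf.py -t 1 perform_tests), then clean up:
    $rpc bdev_error_delete EE_Dev_1
    $rpc bdev_malloc_delete Dev_1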
00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 true 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 Dev_2 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 [ 00:11:48.575 { 00:11:48.575 "name": "Dev_2", 00:11:48.575 "aliases": [ 00:11:48.575 "24970fb2-815f-4b58-a127-2cbdb8de06d7" 00:11:48.575 ], 00:11:48.575 "product_name": "Malloc disk", 00:11:48.575 "block_size": 512, 00:11:48.575 "num_blocks": 262144, 00:11:48.575 "uuid": "24970fb2-815f-4b58-a127-2cbdb8de06d7", 00:11:48.575 "assigned_rate_limits": { 00:11:48.575 "rw_ios_per_sec": 0, 00:11:48.575 "rw_mbytes_per_sec": 0, 00:11:48.575 "r_mbytes_per_sec": 0, 00:11:48.575 "w_mbytes_per_sec": 0 00:11:48.575 }, 00:11:48.575 "claimed": false, 00:11:48.575 "zoned": false, 00:11:48.575 "supported_io_types": { 00:11:48.575 "read": true, 00:11:48.575 "write": true, 00:11:48.575 "unmap": true, 00:11:48.575 "flush": true, 00:11:48.575 "reset": true, 00:11:48.575 "nvme_admin": false, 00:11:48.575 "nvme_io": false, 00:11:48.575 "nvme_io_md": false, 00:11:48.575 "write_zeroes": true, 00:11:48.575 "zcopy": true, 00:11:48.575 "get_zone_info": false, 00:11:48.575 "zone_management": false, 00:11:48.575 "zone_append": false, 00:11:48.575 "compare": false, 00:11:48.575 "compare_and_write": false, 00:11:48.575 "abort": true, 00:11:48.575 "seek_hole": false, 00:11:48.575 "seek_data": false, 00:11:48.575 "copy": true, 00:11:48.575 "nvme_iov_md": false 00:11:48.575 }, 00:11:48.575 "memory_domains": [ 00:11:48.575 { 00:11:48.575 "dma_device_id": "system", 00:11:48.575 "dma_device_type": 1 00:11:48.575 }, 00:11:48.575 { 00:11:48.575 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.575 "dma_device_type": 2 00:11:48.575 } 00:11:48.575 ], 00:11:48.575 "driver_specific": {} 00:11:48.575 } 00:11:48.575 ] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:11:48.575 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:48.575 07:17:20 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:48.575 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:11:48.575 07:17:20 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:48.575 Running I/O for 5 seconds... 00:11:49.513 07:17:21 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 1579605 00:11:49.513 07:17:21 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 1579605' 00:11:49.513 Process is existed as continue on error is set. Pid: 1579605 00:11:49.513 07:17:21 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:11:49.513 07:17:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:49.513 07:17:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:49.513 07:17:22 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:49.513 07:17:22 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:11:49.513 07:17:22 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:49.513 07:17:22 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:49.513 07:17:22 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:49.513 07:17:22 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:11:49.773 Timeout while waiting for response: 00:11:49.773 00:11:49.773 00:11:54.034 00:11:54.034 Latency(us) 00:11:54.034 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:54.034 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:54.034 EE_Dev_1 : 0.91 40651.18 158.79 5.52 0.00 390.37 117.96 635.70 00:11:54.034 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:54.034 Dev_2 : 5.00 88100.12 344.14 0.00 0.00 178.43 70.04 18874.37 00:11:54.034 =================================================================================================================== 00:11:54.034 Total : 128751.30 502.93 5.52 0.00 194.78 70.04 18874.37 00:11:54.602 07:17:27 blockdev_general.bdev_error -- bdev/blockdev.sh@498 -- # killprocess 1579605 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # '[' -z 1579605 ']' 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # kill -0 1579605 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # uname 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # 
'[' Linux = Linux ']' 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1579605 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1579605' 00:11:54.602 killing process with pid 1579605 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@969 -- # kill 1579605 00:11:54.602 Received shutdown signal, test time was about 5.000000 seconds 00:11:54.602 00:11:54.602 Latency(us) 00:11:54.602 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:54.602 =================================================================================================================== 00:11:54.602 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:54.602 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@974 -- # wait 1579605 00:11:54.861 07:17:27 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=1580922 00:11:54.861 07:17:27 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:11:54.861 07:17:27 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 1580922' 00:11:54.861 Process error testing pid: 1580922 00:11:54.861 07:17:27 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 1580922 00:11:54.861 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 1580922 ']' 00:11:54.861 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:54.861 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:54.861 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:54.861 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:54.861 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:54.861 07:17:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:54.861 [2024-07-25 07:17:27.378407] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:11:54.861 [2024-07-25 07:17:27.378470] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1580922 ] 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:55.120 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:55.120 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:55.120 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:55.120 [2024-07-25 07:17:27.497962] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:55.120 [2024-07-25 07:17:27.584332] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:11:56.058 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.058 Dev_1 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.058 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.058 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.058 [ 00:11:56.058 { 00:11:56.058 "name": "Dev_1", 00:11:56.058 "aliases": [ 
00:11:56.058 "41db2031-0437-4a2c-8a66-f747d4288e2c" 00:11:56.058 ], 00:11:56.058 "product_name": "Malloc disk", 00:11:56.058 "block_size": 512, 00:11:56.058 "num_blocks": 262144, 00:11:56.058 "uuid": "41db2031-0437-4a2c-8a66-f747d4288e2c", 00:11:56.058 "assigned_rate_limits": { 00:11:56.059 "rw_ios_per_sec": 0, 00:11:56.059 "rw_mbytes_per_sec": 0, 00:11:56.059 "r_mbytes_per_sec": 0, 00:11:56.059 "w_mbytes_per_sec": 0 00:11:56.059 }, 00:11:56.059 "claimed": false, 00:11:56.059 "zoned": false, 00:11:56.059 "supported_io_types": { 00:11:56.059 "read": true, 00:11:56.059 "write": true, 00:11:56.059 "unmap": true, 00:11:56.059 "flush": true, 00:11:56.059 "reset": true, 00:11:56.059 "nvme_admin": false, 00:11:56.059 "nvme_io": false, 00:11:56.059 "nvme_io_md": false, 00:11:56.059 "write_zeroes": true, 00:11:56.059 "zcopy": true, 00:11:56.059 "get_zone_info": false, 00:11:56.059 "zone_management": false, 00:11:56.059 "zone_append": false, 00:11:56.059 "compare": false, 00:11:56.059 "compare_and_write": false, 00:11:56.059 "abort": true, 00:11:56.059 "seek_hole": false, 00:11:56.059 "seek_data": false, 00:11:56.059 "copy": true, 00:11:56.059 "nvme_iov_md": false 00:11:56.059 }, 00:11:56.059 "memory_domains": [ 00:11:56.059 { 00:11:56.059 "dma_device_id": "system", 00:11:56.059 "dma_device_type": 1 00:11:56.059 }, 00:11:56.059 { 00:11:56.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.059 "dma_device_type": 2 00:11:56.059 } 00:11:56.059 ], 00:11:56.059 "driver_specific": {} 00:11:56.059 } 00:11:56.059 ] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:11:56.059 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.059 true 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.059 Dev_2 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.059 07:17:28 
blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.059 [ 00:11:56.059 { 00:11:56.059 "name": "Dev_2", 00:11:56.059 "aliases": [ 00:11:56.059 "2713d962-ed64-48f4-9ca8-37199cf5d870" 00:11:56.059 ], 00:11:56.059 "product_name": "Malloc disk", 00:11:56.059 "block_size": 512, 00:11:56.059 "num_blocks": 262144, 00:11:56.059 "uuid": "2713d962-ed64-48f4-9ca8-37199cf5d870", 00:11:56.059 "assigned_rate_limits": { 00:11:56.059 "rw_ios_per_sec": 0, 00:11:56.059 "rw_mbytes_per_sec": 0, 00:11:56.059 "r_mbytes_per_sec": 0, 00:11:56.059 "w_mbytes_per_sec": 0 00:11:56.059 }, 00:11:56.059 "claimed": false, 00:11:56.059 "zoned": false, 00:11:56.059 "supported_io_types": { 00:11:56.059 "read": true, 00:11:56.059 "write": true, 00:11:56.059 "unmap": true, 00:11:56.059 "flush": true, 00:11:56.059 "reset": true, 00:11:56.059 "nvme_admin": false, 00:11:56.059 "nvme_io": false, 00:11:56.059 "nvme_io_md": false, 00:11:56.059 "write_zeroes": true, 00:11:56.059 "zcopy": true, 00:11:56.059 "get_zone_info": false, 00:11:56.059 "zone_management": false, 00:11:56.059 "zone_append": false, 00:11:56.059 "compare": false, 00:11:56.059 "compare_and_write": false, 00:11:56.059 "abort": true, 00:11:56.059 "seek_hole": false, 00:11:56.059 "seek_data": false, 00:11:56.059 "copy": true, 00:11:56.059 "nvme_iov_md": false 00:11:56.059 }, 00:11:56.059 "memory_domains": [ 00:11:56.059 { 00:11:56.059 "dma_device_id": "system", 00:11:56.059 "dma_device_type": 1 00:11:56.059 }, 00:11:56.059 { 00:11:56.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.059 "dma_device_type": 2 00:11:56.059 } 00:11:56.059 ], 00:11:56.059 "driver_specific": {} 00:11:56.059 } 00:11:56.059 ] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:11:56.059 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:56.059 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 1580922 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # local es=0 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # valid_exec_arg wait 1580922 00:11:56.059 07:17:28 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@638 -- # local arg=wait 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # type -t wait 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case 
"$(type -t "$arg")" in 00:11:56.059 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # wait 1580922 00:11:56.059 Running I/O for 5 seconds... 00:11:56.059 task offset: 14024 on job bdev=EE_Dev_1 fails 00:11:56.059 00:11:56.059 Latency(us) 00:11:56.059 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:56.059 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:56.059 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:11:56.059 EE_Dev_1 : 0.00 31837.92 124.37 7235.89 0.00 342.20 120.42 602.93 00:11:56.059 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:56.059 Dev_2 : 0.00 19826.52 77.45 0.00 0.00 605.49 116.33 1127.22 00:11:56.059 =================================================================================================================== 00:11:56.059 Total : 51664.43 201.81 7235.89 0.00 485.00 116.33 1127.22 00:11:56.059 [2024-07-25 07:17:28.532695] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:56.059 request: 00:11:56.059 { 00:11:56.059 "method": "perform_tests", 00:11:56.059 "req_id": 1 00:11:56.059 } 00:11:56.059 Got JSON-RPC error response 00:11:56.059 response: 00:11:56.059 { 00:11:56.059 "code": -32603, 00:11:56.059 "message": "bdevperf failed with error Operation not permitted" 00:11:56.059 } 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # es=255 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # es=127 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@663 -- # case "$es" in 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@670 -- # es=1 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:11:56.319 00:11:56.319 real 0m8.933s 00:11:56.319 user 0m9.351s 00:11:56.319 sys 0m0.834s 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:56.319 07:17:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:56.319 ************************************ 00:11:56.319 END TEST bdev_error 00:11:56.319 ************************************ 00:11:56.319 07:17:28 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:11:56.319 07:17:28 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:56.319 07:17:28 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:56.319 07:17:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:56.577 ************************************ 00:11:56.577 START TEST bdev_stat 00:11:56.578 ************************************ 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # stat_test_suite '' 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=1581216 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 1581216' 00:11:56.578 Process Bdev IO statistics testing pid: 1581216 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 
4096 -w randread -t 10 -C '' 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 1581216 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # '[' -z 1581216 ']' 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:56.578 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:56.578 07:17:28 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:56.578 [2024-07-25 07:17:28.924025] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:11:56.578 [2024-07-25 07:17:28.924078] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1581216 ] 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:11:56.578 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:56.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:56.578 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:56.578 [2024-07-25 07:17:29.053684] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:56.837 [2024-07-25 07:17:29.143109] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:56.837 [2024-07-25 07:17:29.143115] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@864 -- # return 0 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:57.404 Malloc_STAT 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:11:57.404 07:17:29 
blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_STAT 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # local i 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:57.404 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:57.404 [ 00:11:57.404 { 00:11:57.404 "name": "Malloc_STAT", 00:11:57.404 "aliases": [ 00:11:57.404 "3fd55d16-cfd0-4cc2-8194-e36053d2439e" 00:11:57.404 ], 00:11:57.404 "product_name": "Malloc disk", 00:11:57.404 "block_size": 512, 00:11:57.404 "num_blocks": 262144, 00:11:57.404 "uuid": "3fd55d16-cfd0-4cc2-8194-e36053d2439e", 00:11:57.404 "assigned_rate_limits": { 00:11:57.404 "rw_ios_per_sec": 0, 00:11:57.404 "rw_mbytes_per_sec": 0, 00:11:57.404 "r_mbytes_per_sec": 0, 00:11:57.404 "w_mbytes_per_sec": 0 00:11:57.404 }, 00:11:57.404 "claimed": false, 00:11:57.404 "zoned": false, 00:11:57.404 "supported_io_types": { 00:11:57.404 "read": true, 00:11:57.404 "write": true, 00:11:57.404 "unmap": true, 00:11:57.404 "flush": true, 00:11:57.404 "reset": true, 00:11:57.404 "nvme_admin": false, 00:11:57.404 "nvme_io": false, 00:11:57.404 "nvme_io_md": false, 00:11:57.404 "write_zeroes": true, 00:11:57.404 "zcopy": true, 00:11:57.404 "get_zone_info": false, 00:11:57.404 "zone_management": false, 00:11:57.404 "zone_append": false, 00:11:57.404 "compare": false, 00:11:57.405 "compare_and_write": false, 00:11:57.405 "abort": true, 00:11:57.405 "seek_hole": false, 00:11:57.405 "seek_data": false, 00:11:57.405 "copy": true, 00:11:57.405 "nvme_iov_md": false 00:11:57.405 }, 00:11:57.405 "memory_domains": [ 00:11:57.405 { 00:11:57.405 "dma_device_id": "system", 00:11:57.405 "dma_device_type": 1 00:11:57.405 }, 00:11:57.405 { 00:11:57.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.405 "dma_device_type": 2 00:11:57.405 } 00:11:57.405 ], 00:11:57.405 "driver_specific": {} 00:11:57.405 } 00:11:57.405 ] 00:11:57.405 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:57.405 07:17:29 blockdev_general.bdev_stat -- common/autotest_common.sh@907 -- # return 0 00:11:57.405 07:17:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:11:57.405 07:17:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:57.663 Running I/O for 10 seconds... 
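While bdevperf runs its 10-second randread workload against Malloc_STAT, the stat test cross-checks the aggregate counters from bdev_get_iostat against the per-channel view (-c): the summed per-channel read count has to land between two aggregate snapshots taken around it. A hedged sketch of that check, assuming the same rpc.py client pointed at the bdevperf application's RPC socket (the test's rpc_cmd helper is the equivalent; variable names are illustrative):

    rpc=./scripts/rpc.py
    io_count1=$($rpc bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    per_ch=$($rpc bdev_get_iostat -b Malloc_STAT -c)       # per-channel statistics
    ch1=$(echo "$per_ch" | jq -r '.channels[0].num_read_ops')
    ch2=$(echo "$per_ch" | jq -r '.channels[1].num_read_ops')
    io_count2=$($rpc bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    # I/O keeps running between the snapshots, so: io_count1 <= ch1+ch2 <= io_count2
    [ $((ch1 + ch2)) -lt "$io_count1" ] && echo "per-channel sum too low"  && exit 1
    [ $((ch1 + ch2)) -gt "$io_count2" ] && echo "per-channel sum too high" && exit 1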
00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:11:59.571 "tick_rate": 2500000000, 00:11:59.571 "ticks": 14211087999343372, 00:11:59.571 "bdevs": [ 00:11:59.571 { 00:11:59.571 "name": "Malloc_STAT", 00:11:59.571 "bytes_read": 792769024, 00:11:59.571 "num_read_ops": 193540, 00:11:59.571 "bytes_written": 0, 00:11:59.571 "num_write_ops": 0, 00:11:59.571 "bytes_unmapped": 0, 00:11:59.571 "num_unmap_ops": 0, 00:11:59.571 "bytes_copied": 0, 00:11:59.571 "num_copy_ops": 0, 00:11:59.571 "read_latency_ticks": 2435586190128, 00:11:59.571 "max_read_latency_ticks": 15003808, 00:11:59.571 "min_read_latency_ticks": 245910, 00:11:59.571 "write_latency_ticks": 0, 00:11:59.571 "max_write_latency_ticks": 0, 00:11:59.571 "min_write_latency_ticks": 0, 00:11:59.571 "unmap_latency_ticks": 0, 00:11:59.571 "max_unmap_latency_ticks": 0, 00:11:59.571 "min_unmap_latency_ticks": 0, 00:11:59.571 "copy_latency_ticks": 0, 00:11:59.571 "max_copy_latency_ticks": 0, 00:11:59.571 "min_copy_latency_ticks": 0, 00:11:59.571 "io_error": {} 00:11:59.571 } 00:11:59.571 ] 00:11:59.571 }' 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=193540 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:59.571 07:17:31 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:11:59.571 "tick_rate": 2500000000, 00:11:59.571 "ticks": 14211088164537256, 00:11:59.571 "name": "Malloc_STAT", 00:11:59.571 "channels": [ 00:11:59.571 { 00:11:59.571 "thread_id": 2, 00:11:59.571 "bytes_read": 407896064, 00:11:59.571 "num_read_ops": 99584, 00:11:59.571 "bytes_written": 0, 00:11:59.571 "num_write_ops": 0, 00:11:59.571 "bytes_unmapped": 0, 00:11:59.571 "num_unmap_ops": 0, 
00:11:59.571 "bytes_copied": 0, 00:11:59.571 "num_copy_ops": 0, 00:11:59.571 "read_latency_ticks": 1258994440806, 00:11:59.571 "max_read_latency_ticks": 13772090, 00:11:59.571 "min_read_latency_ticks": 8436840, 00:11:59.571 "write_latency_ticks": 0, 00:11:59.571 "max_write_latency_ticks": 0, 00:11:59.571 "min_write_latency_ticks": 0, 00:11:59.571 "unmap_latency_ticks": 0, 00:11:59.571 "max_unmap_latency_ticks": 0, 00:11:59.571 "min_unmap_latency_ticks": 0, 00:11:59.571 "copy_latency_ticks": 0, 00:11:59.571 "max_copy_latency_ticks": 0, 00:11:59.571 "min_copy_latency_ticks": 0 00:11:59.571 }, 00:11:59.571 { 00:11:59.571 "thread_id": 3, 00:11:59.571 "bytes_read": 412090368, 00:11:59.571 "num_read_ops": 100608, 00:11:59.571 "bytes_written": 0, 00:11:59.571 "num_write_ops": 0, 00:11:59.571 "bytes_unmapped": 0, 00:11:59.571 "num_unmap_ops": 0, 00:11:59.571 "bytes_copied": 0, 00:11:59.571 "num_copy_ops": 0, 00:11:59.571 "read_latency_ticks": 1260431304232, 00:11:59.571 "max_read_latency_ticks": 15003808, 00:11:59.571 "min_read_latency_ticks": 8566066, 00:11:59.571 "write_latency_ticks": 0, 00:11:59.571 "max_write_latency_ticks": 0, 00:11:59.571 "min_write_latency_ticks": 0, 00:11:59.571 "unmap_latency_ticks": 0, 00:11:59.571 "max_unmap_latency_ticks": 0, 00:11:59.571 "min_unmap_latency_ticks": 0, 00:11:59.571 "copy_latency_ticks": 0, 00:11:59.571 "max_copy_latency_ticks": 0, 00:11:59.571 "min_copy_latency_ticks": 0 00:11:59.571 } 00:11:59.571 ] 00:11:59.571 }' 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # io_count_per_channel1=99584 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=99584 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=100608 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=200192 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:59.571 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:11:59.831 "tick_rate": 2500000000, 00:11:59.831 "ticks": 14211088468729450, 00:11:59.831 "bdevs": [ 00:11:59.831 { 00:11:59.831 "name": "Malloc_STAT", 00:11:59.831 "bytes_read": 870363648, 00:11:59.831 "num_read_ops": 212484, 00:11:59.831 "bytes_written": 0, 00:11:59.831 "num_write_ops": 0, 00:11:59.831 "bytes_unmapped": 0, 00:11:59.831 "num_unmap_ops": 0, 00:11:59.831 "bytes_copied": 0, 00:11:59.831 "num_copy_ops": 0, 00:11:59.831 "read_latency_ticks": 2674114707368, 00:11:59.831 "max_read_latency_ticks": 15003808, 00:11:59.831 "min_read_latency_ticks": 245910, 00:11:59.831 "write_latency_ticks": 0, 00:11:59.831 "max_write_latency_ticks": 0, 00:11:59.831 "min_write_latency_ticks": 0, 00:11:59.831 "unmap_latency_ticks": 0, 00:11:59.831 "max_unmap_latency_ticks": 0, 00:11:59.831 "min_unmap_latency_ticks": 0, 00:11:59.831 "copy_latency_ticks": 0, 00:11:59.831 "max_copy_latency_ticks": 0, 00:11:59.831 
"min_copy_latency_ticks": 0, 00:11:59.831 "io_error": {} 00:11:59.831 } 00:11:59.831 ] 00:11:59.831 }' 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=212484 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 200192 -lt 193540 ']' 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 200192 -gt 212484 ']' 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:59.831 00:11:59.831 Latency(us) 00:11:59.831 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:59.831 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:59.831 Malloc_STAT : 2.17 50531.94 197.39 0.00 0.00 5053.68 1756.36 5531.24 00:11:59.831 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:59.831 Malloc_STAT : 2.17 51003.10 199.23 0.00 0.00 5006.99 1717.04 6003.10 00:11:59.831 =================================================================================================================== 00:11:59.831 Total : 101535.04 396.62 0.00 0.00 5030.23 1717.04 6003.10 00:11:59.831 0 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 1581216 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # '[' -z 1581216 ']' 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # kill -0 1581216 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # uname 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1581216 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1581216' 00:11:59.831 killing process with pid 1581216 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@969 -- # kill 1581216 00:11:59.831 Received shutdown signal, test time was about 2.254377 seconds 00:11:59.831 00:11:59.831 Latency(us) 00:11:59.831 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:59.831 =================================================================================================================== 00:11:59.831 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:59.831 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@974 -- # wait 1581216 00:12:00.091 07:17:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:12:00.091 00:12:00.091 real 0m3.590s 00:12:00.091 user 0m7.208s 00:12:00.091 sys 0m0.448s 00:12:00.091 07:17:32 blockdev_general.bdev_stat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:00.091 07:17:32 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:00.091 ************************************ 00:12:00.091 END TEST bdev_stat 00:12:00.091 ************************************ 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:12:00.091 07:17:32 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:12:00.091 00:12:00.091 real 1m54.028s 00:12:00.091 user 7m23.891s 00:12:00.091 sys 0m21.777s 00:12:00.091 07:17:32 blockdev_general -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:00.091 07:17:32 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:00.091 ************************************ 00:12:00.091 END TEST blockdev_general 00:12:00.091 ************************************ 00:12:00.091 07:17:32 -- spdk/autotest.sh@194 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:12:00.091 07:17:32 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:12:00.091 07:17:32 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:00.091 07:17:32 -- common/autotest_common.sh@10 -- # set +x 00:12:00.091 ************************************ 00:12:00.091 START TEST bdev_raid 00:12:00.091 ************************************ 00:12:00.091 07:17:32 bdev_raid -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:12:00.350 * Looking for test storage... 
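Editor's note: the bdev_stat run that finishes above cross-checks the per-channel counters against the per-bdev aggregate: it sums num_read_ops over the two channels (99584 + 100608 = 200192) and requires that sum to stay within a window of the aggregate count (212484). A standalone bash sketch of the same check is below; paths, the bdev name and the jq filters are taken from this log, while the -c per-channel invocation is an assumption (the exact per-channel call is not inside this excerpt).

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
    # per-channel counters (assumed -c form) and the per-bdev aggregate
    ch_sum=$($rpc bdev_get_iostat -b Malloc_STAT -c | jq '[.channels[].num_read_ops] | add')
    agg=$($rpc bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    # the channels are sampled before the aggregate, so the sum may trail it
    # slightly but must never exceed it
    [ "$ch_sum" -le "$agg" ] || echo "per-channel sum $ch_sum exceeds aggregate $agg"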
00:12:00.350 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:12:00.350 07:17:32 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:12:00.350 07:17:32 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:12:00.350 07:17:32 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:12:00.350 07:17:32 bdev_raid -- bdev/bdev_raid.sh@927 -- # mkdir -p /raidtest 00:12:00.350 07:17:32 bdev_raid -- bdev/bdev_raid.sh@928 -- # trap 'cleanup; exit 1' EXIT 00:12:00.350 07:17:32 bdev_raid -- bdev/bdev_raid.sh@930 -- # base_blocklen=512 00:12:00.350 07:17:32 bdev_raid -- bdev/bdev_raid.sh@932 -- # run_test raid0_resize_superblock_test raid_resize_superblock_test 0 00:12:00.350 07:17:32 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:00.350 07:17:32 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:00.350 07:17:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:00.350 ************************************ 00:12:00.350 START TEST raid0_resize_superblock_test 00:12:00.350 ************************************ 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 0 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=0 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=1582002 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 1582002' 00:12:00.350 Process raid pid: 1582002 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # waitforlisten 1582002 /var/tmp/spdk-raid.sock 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1582002 ']' 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:00.350 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:00.350 07:17:32 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.350 [2024-07-25 07:17:32.821734] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
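Editor's note: bdev_raid.sh runs each test case against its own bdev_svc instance on a private RPC socket and tears everything down through the EXIT trap, which is what the waitforlisten/killprocess lines in this log correspond to. A minimal sketch of that pattern follows; the polling loop is a simplified stand-in for the harness's waitforlisten helper, not its real implementation.

    sock=/var/tmp/spdk-raid.sock
    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s $sock"
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r "$sock" -i 0 -L bdev_raid &
    raid_pid=$!
    trap 'kill -9 $raid_pid 2>/dev/null; rm -rf /raidtest' EXIT
    # wait until the app answers on the socket before sending real RPCs
    until $rpc_py rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
    $rpc_py bdev_malloc_create -b malloc0 512 512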
00:12:00.350 [2024-07-25 07:17:32.821795] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:00.610 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:00.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.610 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:00.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.611 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:00.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:00.611 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:00.611 [2024-07-25 07:17:32.955387] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:00.611 [2024-07-25 07:17:33.040923] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:00.611 [2024-07-25 07:17:33.095945] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:00.611 [2024-07-25 07:17:33.095970] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:01.547 07:17:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:01.547 07:17:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:12:01.547 07:17:33 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:12:01.547 malloc0 00:12:01.547 07:17:34 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:12:01.806 [2024-07-25 07:17:34.267052] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:12:01.806 [2024-07-25 07:17:34.267102] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:01.806 [2024-07-25 07:17:34.267121] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa5be0 00:12:01.806 [2024-07-25 07:17:34.267132] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:01.806 [2024-07-25 07:17:34.268593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:01.806 [2024-07-25 07:17:34.268621] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:12:01.806 pt0 00:12:01.806 07:17:34 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0 00:12:02.065 a5fe71f3-4faf-4743-8ac7-5ed09642ba89 00:12:02.065 07:17:34 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64 00:12:02.323 207a9def-aa93-4959-8ae9-a07cd6156c87 00:12:02.323 07:17:34 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64 00:12:02.582 681f046c-d3b1-4287-8914-0e4e0d838e46 00:12:02.582 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in 00:12:02.582 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@884 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 0 -z 64 -b 'lvs0/lvol0 lvs0/lvol1' -s 00:12:02.840 [2024-07-25 07:17:35.256970] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 207a9def-aa93-4959-8ae9-a07cd6156c87 is claimed 00:12:02.840 [2024-07-25 07:17:35.257045] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 681f046c-d3b1-4287-8914-0e4e0d838e46 is claimed 00:12:02.840 [2024-07-25 07:17:35.257183] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2150510 00:12:02.840 [2024-07-25 07:17:35.257194] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 245760, blocklen 512 00:12:02.840 [2024-07-25 07:17:35.257367] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa5300 00:12:02.840 [2024-07-25 07:17:35.257515] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2150510 00:12:02.840 [2024-07-25 07:17:35.257524] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2150510 00:12:02.840 [2024-07-25 07:17:35.257620] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:02.840 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:12:02.840 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks' 00:12:03.098 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 )) 00:12:03.098 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:12:03.098 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks' 00:12:03.357 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 )) 00:12:03.357 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:03.357 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:03.357 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:03.357 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # jq '.[].num_blocks' 00:12:03.616 [2024-07-25 07:17:35.950968] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.616 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:03.616 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:03.616 07:17:35 
bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # (( 245760 == 245760 )) 00:12:03.616 07:17:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100 00:12:04.183 [2024-07-25 07:17:36.452230] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:04.183 [2024-07-25 07:17:36.452254] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '207a9def-aa93-4959-8ae9-a07cd6156c87' was resized: old size 131072, new size 204800 00:12:04.183 07:17:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100 00:12:04.183 [2024-07-25 07:17:36.692814] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:04.183 [2024-07-25 07:17:36.692833] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '681f046c-d3b1-4287-8914-0e4e0d838e46' was resized: old size 131072, new size 204800 00:12:04.183 [2024-07-25 07:17:36.692855] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 245760 to 393216 00:12:04.183 07:17:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:12:04.183 07:17:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks' 00:12:04.750 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 )) 00:12:04.750 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:12:04.750 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks' 00:12:05.009 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 )) 00:12:05.009 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:05.009 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:05.009 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:05.009 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # jq '.[].num_blocks' 00:12:05.267 [2024-07-25 07:17:37.647436] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:05.267 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:05.267 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:05.267 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # (( 393216 == 393216 )) 00:12:05.267 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0 00:12:05.526 [2024-07-25 07:17:37.875856] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0 00:12:05.526 [2024-07-25 07:17:37.875917] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0 00:12:05.526 [2024-07-25 07:17:37.875927] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:05.526 [2024-07-25 07:17:37.875939] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1 00:12:05.526 [2024-07-25 07:17:37.876020] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:05.526 [2024-07-25 07:17:37.876050] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:05.526 [2024-07-25 07:17:37.876061] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2150510 name Raid, state offline 00:12:05.526 07:17:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:12:05.785 [2024-07-25 07:17:38.104418] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:12:05.785 [2024-07-25 07:17:38.104457] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:05.785 [2024-07-25 07:17:38.104475] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x214f4c0 00:12:05.785 [2024-07-25 07:17:38.104486] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:05.785 [2024-07-25 07:17:38.105972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:05.785 [2024-07-25 07:17:38.106000] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:12:05.785 [2024-07-25 07:17:38.107120] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 207a9def-aa93-4959-8ae9-a07cd6156c87 00:12:05.785 [2024-07-25 07:17:38.107165] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 207a9def-aa93-4959-8ae9-a07cd6156c87 is claimed 00:12:05.785 [2024-07-25 07:17:38.107248] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 681f046c-d3b1-4287-8914-0e4e0d838e46 00:12:05.785 [2024-07-25 07:17:38.107265] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 681f046c-d3b1-4287-8914-0e4e0d838e46 is claimed 00:12:05.785 [2024-07-25 07:17:38.107372] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev 681f046c-d3b1-4287-8914-0e4e0d838e46 (2) smaller than existing raid bdev Raid (3) 00:12:05.785 [2024-07-25 07:17:38.107400] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2154430 00:12:05.785 [2024-07-25 07:17:38.107408] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 393216, blocklen 512 00:12:05.785 [2024-07-25 07:17:38.107561] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa5300 00:12:05.785 [2024-07-25 07:17:38.107693] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2154430 00:12:05.785 [2024-07-25 07:17:38.107702] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2154430 00:12:05.785 [2024-07-25 07:17:38.107802] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:05.785 pt0 00:12:05.785 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:05.785 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b Raid 00:12:05.785 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:05.785 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # jq '.[].num_blocks' 00:12:06.044 [2024-07-25 07:17:38.333255] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # (( 393216 == 393216 )) 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 1582002 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1582002 ']' 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1582002 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # uname 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1582002 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1582002' 00:12:06.044 killing process with pid 1582002 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 1582002 00:12:06.044 [2024-07-25 07:17:38.407986] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:06.044 [2024-07-25 07:17:38.408029] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:06.044 [2024-07-25 07:17:38.408066] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:06.044 [2024-07-25 07:17:38.408076] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2154430 name Raid, state offline 00:12:06.044 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 1582002 00:12:06.044 [2024-07-25 07:17:38.486296] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:06.303 07:17:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0 00:12:06.303 00:12:06.303 real 0m5.913s 00:12:06.303 user 0m9.726s 00:12:06.303 sys 0m1.214s 00:12:06.303 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:06.303 07:17:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.303 ************************************ 00:12:06.303 END TEST raid0_resize_superblock_test 00:12:06.303 ************************************ 00:12:06.304 07:17:38 bdev_raid -- bdev/bdev_raid.sh@933 -- # run_test raid1_resize_superblock_test raid_resize_superblock_test 1 00:12:06.304 07:17:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:06.304 07:17:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 
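Editor's note: condensed, the raid0 superblock test traced above is the following RPC sequence (names, sizes and flags exactly as they appear in this log, $rpc_py as defined by bdev_raid.sh): build two 64 MB lvols on a passthru-backed lvstore, assemble them into a RAID0 with an on-disk superblock (-s), grow both lvols to 100 MB, and confirm the raid bdev grew from 245760 to 393216 blocks.

    $rpc_py bdev_malloc_create -b malloc0 512 512
    $rpc_py bdev_passthru_create -b malloc0 -p pt0
    $rpc_py bdev_lvol_create_lvstore pt0 lvs0
    $rpc_py bdev_lvol_create -l lvs0 lvol0 64
    $rpc_py bdev_lvol_create -l lvs0 lvol1 64
    $rpc_py bdev_raid_create -n Raid -r 0 -z 64 -b 'lvs0/lvol0 lvs0/lvol1' -s
    $rpc_py bdev_lvol_resize lvs0/lvol0 100
    $rpc_py bdev_lvol_resize lvs0/lvol1 100
    # with the superblock enabled the raid resizes itself once every base bdev
    # has grown: 245760 -> 393216 blocks in the run above
    $rpc_py bdev_get_bdevs -b Raid | jq '.[].num_blocks'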
00:12:06.304 07:17:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:06.304 ************************************ 00:12:06.304 START TEST raid1_resize_superblock_test 00:12:06.304 ************************************ 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 1 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=1 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=1583001 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 1583001' 00:12:06.304 Process raid pid: 1583001 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # waitforlisten 1583001 /var/tmp/spdk-raid.sock 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1583001 ']' 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:06.304 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:06.304 07:17:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.304 [2024-07-25 07:17:38.815959] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
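Editor's note: killprocess, used to shut the app down at the end of each test above, checks that the pid is still alive and still looks like an SPDK reactor before signalling it, so a recycled pid is never killed by mistake. A simplified sketch of that guard is below; the real helper in autotest_common.sh also special-cases sudo-wrapped processes, which this sketch omits.

    killprocess() {
        local pid=$1
        kill -0 "$pid" 2>/dev/null || return 0      # already gone
        local name
        name=$(ps --no-headers -o comm= "$pid")     # "reactor_0" in the runs above
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" || true
    }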
00:12:06.304 [2024-07-25 07:17:38.816015] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:06.563 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:06.563 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:06.563 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:06.563 [2024-07-25 07:17:38.952131] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:06.563 [2024-07-25 07:17:39.038644] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:06.854 [2024-07-25 07:17:39.106507] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:06.854 [2024-07-25 07:17:39.106538] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.422 07:17:39 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:07.422 07:17:39 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:12:07.422 07:17:39 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:12:07.680 malloc0 00:12:07.680 07:17:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:12:07.939 [2024-07-25 07:17:40.274331] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:12:07.939 [2024-07-25 07:17:40.274377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.939 [2024-07-25 07:17:40.274398] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238dbe0 00:12:07.939 [2024-07-25 07:17:40.274410] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.939 [2024-07-25 07:17:40.276000] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.939 [2024-07-25 07:17:40.276028] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:12:07.939 pt0 00:12:07.939 07:17:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0 00:12:08.198 043d94fd-5814-44c7-a6aa-244d637b8206 00:12:08.198 07:17:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64 00:12:08.456 35db2ae3-59dc-43a7-a3f8-b1b833ac2b41 00:12:08.456 07:17:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64 00:12:08.715 ba221d2d-2afd-40e1-a7d8-0852895295f1 00:12:08.715 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in 00:12:08.715 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@885 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 1 -b 'lvs0/lvol0 lvs0/lvol1' -s 00:12:08.715 [2024-07-25 07:17:41.232589] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 35db2ae3-59dc-43a7-a3f8-b1b833ac2b41 is claimed 00:12:08.715 [2024-07-25 07:17:41.232657] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev ba221d2d-2afd-40e1-a7d8-0852895295f1 is claimed 00:12:08.715 [2024-07-25 07:17:41.232791] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2538510 00:12:08.715 [2024-07-25 07:17:41.232802] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 122880, blocklen 512 00:12:08.715 [2024-07-25 07:17:41.232967] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253b030 00:12:08.715 [2024-07-25 07:17:41.233116] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2538510 00:12:08.715 [2024-07-25 07:17:41.233126] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2538510 00:12:08.715 [2024-07-25 07:17:41.233231] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:08.974 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:12:08.974 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks' 00:12:08.974 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 )) 00:12:08.974 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:12:08.974 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks' 00:12:09.233 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 )) 00:12:09.233 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:09.233 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:09.233 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:09.233 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # jq '.[].num_blocks' 00:12:09.491 [2024-07-25 07:17:41.902522] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:09.491 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:09.491 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:12:09.491 07:17:41 
bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # (( 122880 == 122880 )) 00:12:09.491 07:17:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100 00:12:09.750 [2024-07-25 07:17:42.119018] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:09.750 [2024-07-25 07:17:42.119039] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '35db2ae3-59dc-43a7-a3f8-b1b833ac2b41' was resized: old size 131072, new size 204800 00:12:09.750 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100 00:12:10.008 [2024-07-25 07:17:42.343582] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:10.008 [2024-07-25 07:17:42.343600] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'ba221d2d-2afd-40e1-a7d8-0852895295f1' was resized: old size 131072, new size 204800 00:12:10.008 [2024-07-25 07:17:42.343627] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 122880 to 196608 00:12:10.008 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:12:10.008 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks' 00:12:10.266 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 )) 00:12:10.266 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:12:10.266 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks' 00:12:10.524 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 )) 00:12:10.524 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:10.524 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:10.524 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:10.524 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # jq '.[].num_blocks' 00:12:10.524 [2024-07-25 07:17:43.021480] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.524 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:10.524 07:17:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:12:10.524 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # (( 196608 == 196608 )) 00:12:10.524 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0 00:12:10.783 [2024-07-25 07:17:43.245878] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0 00:12:10.783 [2024-07-25 07:17:43.245934] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0 00:12:10.783 [2024-07-25 07:17:43.245955] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1 00:12:10.783 [2024-07-25 07:17:43.246061] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:10.783 [2024-07-25 07:17:43.246191] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:10.783 [2024-07-25 07:17:43.246246] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:10.783 [2024-07-25 07:17:43.246258] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2538510 name Raid, state offline 00:12:10.783 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:12:11.041 [2024-07-25 07:17:43.470436] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:12:11.041 [2024-07-25 07:17:43.470471] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:11.041 [2024-07-25 07:17:43.470488] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25374c0 00:12:11.041 [2024-07-25 07:17:43.470499] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:11.041 [2024-07-25 07:17:43.471971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:11.041 [2024-07-25 07:17:43.471998] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:12:11.041 [2024-07-25 07:17:43.473098] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 35db2ae3-59dc-43a7-a3f8-b1b833ac2b41 00:12:11.041 [2024-07-25 07:17:43.473132] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 35db2ae3-59dc-43a7-a3f8-b1b833ac2b41 is claimed 00:12:11.041 [2024-07-25 07:17:43.473224] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev ba221d2d-2afd-40e1-a7d8-0852895295f1 00:12:11.041 [2024-07-25 07:17:43.473246] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev ba221d2d-2afd-40e1-a7d8-0852895295f1 is claimed 00:12:11.041 [2024-07-25 07:17:43.473350] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev ba221d2d-2afd-40e1-a7d8-0852895295f1 (2) smaller than existing raid bdev Raid (3) 00:12:11.041 [2024-07-25 07:17:43.473378] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x253c4a0 00:12:11.041 [2024-07-25 07:17:43.473386] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:11.041 [2024-07-25 07:17:43.473539] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253ef20 00:12:11.041 [2024-07-25 07:17:43.473671] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x253c4a0 00:12:11.041 [2024-07-25 07:17:43.473681] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x253c4a0 00:12:11.041 [2024-07-25 07:17:43.473780] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:11.041 pt0 00:12:11.041 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:11.041 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b Raid 00:12:11.041 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:11.041 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # jq '.[].num_blocks' 00:12:11.300 [2024-07-25 07:17:43.699270] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:11.300 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:11.300 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:12:11.300 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # (( 196608 == 196608 )) 00:12:11.300 07:17:43 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 1583001 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1583001 ']' 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1583001 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # uname 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1583001 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1583001' 00:12:11.301 killing process with pid 1583001 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 1583001 00:12:11.301 [2024-07-25 07:17:43.775119] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:11.301 [2024-07-25 07:17:43.775170] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:11.301 [2024-07-25 07:17:43.775213] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:11.301 [2024-07-25 07:17:43.775223] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x253c4a0 name Raid, state offline 00:12:11.301 07:17:43 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 1583001 00:12:11.560 [2024-07-25 07:17:43.855021] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:11.560 07:17:44 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0 00:12:11.560 00:12:11.560 real 0m5.288s 00:12:11.560 user 0m8.603s 00:12:11.560 sys 0m1.092s 00:12:11.560 07:17:44 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:11.560 07:17:44 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.560 ************************************ 00:12:11.560 END TEST raid1_resize_superblock_test 00:12:11.560 ************************************ 00:12:11.560 07:17:44 bdev_raid -- bdev/bdev_raid.sh@935 -- # uname -s 00:12:11.819 07:17:44 bdev_raid -- bdev/bdev_raid.sh@935 -- # '[' Linux = Linux ']' 00:12:11.819 07:17:44 bdev_raid -- bdev/bdev_raid.sh@935 -- # modprobe -n nbd 00:12:11.819 07:17:44 bdev_raid -- bdev/bdev_raid.sh@936 -- # has_nbd=true 
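Editor's note: the block counts printed by the raid0 and raid1 resize tests above are mutually consistent and imply a fixed per-base-bdev reservation when the superblock is enabled; the reservation size itself is never printed, so the 8192-block figure below is inferred from the numbers rather than quoted from the log.

    # 64 MB lvol  = 131072 blocks (512-byte blocks); usable = 131072 - 8192 = 122880
    # 100 MB lvol = 204800 blocks;                   usable = 204800 - 8192 = 196608
    # raid0, 2 base bdevs (striped):  2 * 122880 = 245760  ->  2 * 196608 = 393216
    # raid1, 2 base bdevs (mirrored):     122880           ->      196608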
00:12:11.819 07:17:44 bdev_raid -- bdev/bdev_raid.sh@937 -- # modprobe nbd 00:12:11.819 07:17:44 bdev_raid -- bdev/bdev_raid.sh@938 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:12:11.819 07:17:44 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:11.819 07:17:44 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:11.819 07:17:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:11.819 ************************************ 00:12:11.819 START TEST raid_function_test_raid0 00:12:11.819 ************************************ 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # raid_function_test raid0 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1584042 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1584042' 00:12:11.819 Process raid pid: 1584042 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1584042 /var/tmp/spdk-raid.sock 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # '[' -z 1584042 ']' 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:11.819 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:11.819 07:17:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:12:11.820 [2024-07-25 07:17:44.208018] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
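Editor's note: the two modprobe lines above gate the nbd-based function tests: modprobe -n is a dry run that only verifies the module could be loaded, and the module is loaded for real once that check passes. Restated as a guard:

    if modprobe -n nbd; then
        has_nbd=true
        modprobe nbd
    fi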
00:12:11.820 [2024-07-25 07:17:44.208075] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:11.820 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:11.820 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:11.820 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:11.820 [2024-07-25 07:17:44.341065] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:12.079 [2024-07-25 07:17:44.428323] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:12.079 [2024-07-25 07:17:44.488014] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.079 [2024-07-25 07:17:44.488047] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.647 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:12.647 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # return 0 00:12:12.647 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:12:12.647 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:12:12.647 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:12.647 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:12:12.647 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:12:12.906 [2024-07-25 07:17:45.352088] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:12.906 [2024-07-25 07:17:45.353438] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:12.906 [2024-07-25 07:17:45.353499] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11749d0 00:12:12.906 [2024-07-25 07:17:45.353510] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:12.906 [2024-07-25 07:17:45.353681] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd7c80 00:12:12.906 [2024-07-25 07:17:45.353795] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11749d0 00:12:12.906 [2024-07-25 07:17:45.353805] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x11749d0 00:12:12.906 [2024-07-25 07:17:45.353894] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
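Editor's note: configure_raid_bdev writes a short RPC batch to rpcs.txt and pipes it through rpc.py, but the batch itself is not reproduced in this excerpt. The commands below are therefore a hypothetical reconstruction of what produces the Base_1/Base_2 raid0 seen above (131072 blocks total at a 512-byte block size, i.e. two 32 MB base bdevs), not the harness's actual file.

    # hypothetical rpcs.txt contents -- the real batch is not shown in this log
    bdev_malloc_create -b Base_1 32 512
    bdev_malloc_create -b Base_2 32 512
    bdev_raid_create -n raid -r raid0 -z 64 -b 'Base_1 Base_2'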
00:12:12.906 Base_1 00:12:12.906 Base_2 00:12:12.906 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:12.906 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:12.906 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:13.166 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:12:13.425 [2024-07-25 07:17:45.817323] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfba480 00:12:13.425 /dev/nbd0 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # local i 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@873 -- # break 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:12:13.425 1+0 records in 00:12:13.425 1+0 records out 00:12:13.425 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217163 s, 18.9 MB/s 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- 
common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # size=4096 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@889 -- # return 0 00:12:13.425 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:12:13.426 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:13.426 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:13.426 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:13.426 07:17:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:12:13.687 { 00:12:13.687 "nbd_device": "/dev/nbd0", 00:12:13.687 "bdev_name": "raid" 00:12:13.687 } 00:12:13.687 ]' 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:12:13.687 { 00:12:13.687 "nbd_device": "/dev/nbd0", 00:12:13.687 "bdev_name": "raid" 00:12:13.687 } 00:12:13.687 ]' 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:12:13.687 4096+0 records in 00:12:13.687 4096+0 records out 00:12:13.687 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0297934 s, 70.4 MB/s 00:12:13.687 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:12:13.949 4096+0 records in 00:12:13.950 4096+0 records out 00:12:13.950 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.270557 s, 7.8 MB/s 00:12:13.950 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:12:14.209 128+0 records in 00:12:14.209 128+0 records out 00:12:14.209 65536 bytes (66 kB, 64 KiB) copied, 0.000825674 s, 79.4 MB/s 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:12:14.209 2035+0 records in 00:12:14.209 2035+0 records out 00:12:14.209 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0113643 s, 91.7 MB/s 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 
/dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:12:14.209 456+0 records in 00:12:14.209 456+0 records out 00:12:14.209 233472 bytes (233 kB, 228 KiB) copied, 0.00270255 s, 86.4 MB/s 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:12:14.209 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:12:14.469 [2024-07-25 07:17:46.813026] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # 
nbd_get_count /var/tmp/spdk-raid.sock 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:14.469 07:17:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1584042 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # '[' -z 1584042 ']' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # kill -0 1584042 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # uname 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1584042 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1584042' 00:12:14.728 killing process with pid 1584042 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@969 -- # kill 1584042 00:12:14.728 [2024-07-25 07:17:47.165112] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:14.728 [2024-07-25 07:17:47.165177] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:14.728 [2024-07-25 07:17:47.165216] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:14.728 [2024-07-25 07:17:47.165227] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11749d0 name raid, state offline 00:12:14.728 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@974 -- # wait 1584042 00:12:14.728 [2024-07-25 07:17:47.181048] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:14.988 07:17:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:12:14.988 00:12:14.988 real 0m3.219s 00:12:14.988 user 0m4.243s 
00:12:14.988 sys 0m1.230s 00:12:14.988 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:14.988 07:17:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:12:14.988 ************************************ 00:12:14.988 END TEST raid_function_test_raid0 00:12:14.988 ************************************ 00:12:14.988 07:17:47 bdev_raid -- bdev/bdev_raid.sh@939 -- # run_test raid_function_test_concat raid_function_test concat 00:12:14.988 07:17:47 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:14.988 07:17:47 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:14.988 07:17:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:14.988 ************************************ 00:12:14.988 START TEST raid_function_test_concat 00:12:14.988 ************************************ 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # raid_function_test concat 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1584662 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1584662' 00:12:14.988 Process raid pid: 1584662 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1584662 /var/tmp/spdk-raid.sock 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # '[' -z 1584662 ']' 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:14.988 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:14.988 07:17:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:12:14.988 [2024-07-25 07:17:47.513457] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
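The unmap check that raid_function_test_raid0 just completed, and that the concat run below repeats, follows one pattern per iteration: fill the NBD-exported raid bdev from a 2 MiB random reference file, zero a block range in the reference, blkdiscard the same byte range on the device, then compare the two again. The commands below are lifted from the trace for the first iteration; the byte offsets are simply the block values in unmap_blk_offs/unmap_blk_nums multiplied by the 512-byte logical block size (128 blocks at block 0 gives 65536 bytes at offset 0; 2035 blocks at block 1028 gives 1041920 bytes at offset 526336).

  dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096           # 2097152-byte reference file
  dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
  blockdev --flushbufs /dev/nbd0
  cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0                       # device matches the reference
  dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc
  blkdiscard -o 0 -l 65536 /dev/nbd0
  blockdev --flushbufs /dev/nbd0
  cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0                       # discarded range must read back as zeroes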
00:12:14.988 [2024-07-25 07:17:47.513520] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices
00:12:15.248 EAL: Requested devices 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:01.6 cannot be used 00:12:15.248
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:15.248 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:15.248 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:15.248 [2024-07-25 07:17:47.647370] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:15.248 [2024-07-25 07:17:47.728240] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:15.507 [2024-07-25 07:17:47.799946] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:15.507 [2024-07-25 07:17:47.799980] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # return 0 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:12:16.444 [2024-07-25 07:17:48.941209] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:16.444 [2024-07-25 07:17:48.942578] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:16.444 [2024-07-25 07:17:48.942640] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xae99d0 00:12:16.444 [2024-07-25 07:17:48.942650] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:16.444 [2024-07-25 07:17:48.942824] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x94cc80 00:12:16.444 [2024-07-25 07:17:48.942936] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xae99d0 00:12:16.444 [2024-07-25 07:17:48.942945] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xae99d0 00:12:16.444 [2024-07-25 07:17:48.943040] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
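As in the raid0 run, the concat raid bdev only becomes visible to ordinary block tools once it is exported through the kernel NBD driver; the nbd_common.sh helpers traced below are thin wrappers around three RPCs. A condensed sketch of that wrapper sequence, with the rpc.py path shortened to a shell variable and the waitfornbd/waitfornbd_exit polling of /proc/partitions omitted:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  $rpc -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0              # export bdev "raid" as /dev/nbd0
  $rpc -s /var/tmp/spdk-raid.sock nbd_get_disks | jq -r '.[] | .nbd_device'  # expect /dev/nbd0
  # ... run I/O against /dev/nbd0 ...
  $rpc -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0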
00:12:16.444 Base_1 00:12:16.444 Base_2 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:16.444 07:17:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:16.703 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:12:17.272 [2024-07-25 07:17:49.675163] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x92f190 00:12:17.272 /dev/nbd0 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # local i 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@873 -- # break 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:12:17.272 1+0 records in 00:12:17.272 1+0 records out 00:12:17.272 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229682 s, 17.8 MB/s 00:12:17.272 07:17:49 
bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # size=4096 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@889 -- # return 0 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:17.272 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:17.531 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:12:17.531 { 00:12:17.531 "nbd_device": "/dev/nbd0", 00:12:17.531 "bdev_name": "raid" 00:12:17.531 } 00:12:17.531 ]' 00:12:17.531 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:12:17.531 { 00:12:17.531 "nbd_device": "/dev/nbd0", 00:12:17.531 "bdev_name": "raid" 00:12:17.531 } 00:12:17.531 ]' 00:12:17.531 07:17:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # 
blksize=512 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:12:17.531 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:12:17.790 4096+0 records in 00:12:17.790 4096+0 records out 00:12:17.790 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0268581 s, 78.1 MB/s 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:12:17.790 4096+0 records in 00:12:17.790 4096+0 records out 00:12:17.790 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.229326 s, 9.1 MB/s 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:12:17.790 128+0 records in 00:12:17.790 128+0 records out 00:12:17.790 65536 bytes (66 kB, 64 KiB) copied, 0.000822887 s, 79.6 MB/s 00:12:17.790 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:12:18.050 2035+0 records in 00:12:18.050 2035+0 records out 00:12:18.050 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0117988 s, 88.3 MB/s 00:12:18.050 07:17:50 
bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:12:18.050 456+0 records in 00:12:18.050 456+0 records out 00:12:18.050 233472 bytes (233 kB, 228 KiB) copied, 0.00269677 s, 86.6 MB/s 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:12:18.050 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:12:18.309 [2024-07-25 07:17:50.641609] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@45 -- # return 0 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:12:18.309 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1584662 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # '[' -z 1584662 ']' 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # kill -0 1584662 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # uname 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1584662 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1584662' 00:12:18.569 killing process with pid 1584662 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@969 -- # kill 1584662 00:12:18.569 [2024-07-25 07:17:50.939874] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:18.569 [2024-07-25 07:17:50.939934] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:18.569 [2024-07-25 07:17:50.939973] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:18.569 [2024-07-25 07:17:50.939984] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xae99d0 name raid, state offline 00:12:18.569 07:17:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@974 -- # wait 1584662 00:12:18.569 [2024-07-25 07:17:50.955806] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:18.828 
07:17:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:12:18.828 00:12:18.828 real 0m3.691s 00:12:18.828 user 0m5.142s 00:12:18.828 sys 0m1.236s 00:12:18.828 07:17:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:18.828 07:17:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:12:18.828 ************************************ 00:12:18.828 END TEST raid_function_test_concat 00:12:18.828 ************************************ 00:12:18.828 07:17:51 bdev_raid -- bdev/bdev_raid.sh@942 -- # run_test raid0_resize_test raid_resize_test 0 00:12:18.828 07:17:51 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:18.828 07:17:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:18.828 07:17:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:18.828 ************************************ 00:12:18.828 START TEST raid0_resize_test 00:12:18.828 ************************************ 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 0 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=0 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=1585294 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 1585294' 00:12:18.828 Process raid pid: 1585294 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # waitforlisten 1585294 /var/tmp/spdk-raid.sock 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # '[' -z 1585294 ']' 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:18.828 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:18.828 07:17:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.828 [2024-07-25 07:17:51.254276] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
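Both function tests end with the same teardown seen in the trace: confirm the recorded PID still belongs to the SPDK reactor, kill it, and wait for bdev_svc to finish its raid cleanup. Reduced roughly to the underlying commands from the killprocess helper (the kill -0 and uname guards are omitted):

  ps --no-headers -o comm= "$raid_pid"    # expect reactor_0
  kill "$raid_pid"
  wait "$raid_pid"                        # returns once raid_bdev_exit has run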
00:12:18.828 [2024-07-25 07:17:51.254330] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:12:18.828 qat_pci_device_allocate(): Reached maximum number of QAT devices
00:12:18.829 EAL: Requested devices 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:01.6 cannot be used 00:12:18.829
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:18.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:18.829 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:19.088 [2024-07-25 07:17:51.387617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:19.088 [2024-07-25 07:17:51.473333] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:19.088 [2024-07-25 07:17:51.532029] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:19.088 [2024-07-25 07:17:51.532055] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:19.656 07:17:52 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:19.656 07:17:52 bdev_raid.raid0_resize_test -- common/autotest_common.sh@864 -- # return 0 00:12:19.656 07:17:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:12:19.916 Base_1 00:12:19.916 07:17:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:12:20.175 Base_2 00:12:20.175 07:17:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 0 -eq 0 ']' 00:12:20.175 07:17:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:12:20.435 [2024-07-25 07:17:52.782606] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:20.435 [2024-07-25 07:17:52.783978] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:20.435 [2024-07-25 07:17:52.784028] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15c1c00 00:12:20.435 [2024-07-25 07:17:52.784037] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:20.435 [2024-07-25 07:17:52.784490] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1105090 00:12:20.435 [2024-07-25 07:17:52.784588] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15c1c00 00:12:20.435 [2024-07-25 07:17:52.784597] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x15c1c00 00:12:20.435 
[2024-07-25 07:17:52.784694] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:20.435 07:17:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:12:20.704 [2024-07-25 07:17:53.007182] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:20.704 [2024-07-25 07:17:53.007197] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:12:20.704 true 00:12:20.704 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:20.704 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:12:20.704 [2024-07-25 07:17:53.231904] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=131072 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=64 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 0 -eq 0 ']' 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@377 -- # expected_size=64 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 64 '!=' 64 ']' 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:12:20.975 [2024-07-25 07:17:53.448312] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:20.975 [2024-07-25 07:17:53.448330] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:12:20.975 [2024-07-25 07:17:53.448353] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:12:20.975 true 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:20.975 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:12:21.234 [2024-07-25 07:17:53.677053] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:21.234 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=262144 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=128 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 0 -eq 0 ']' 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@393 -- # expected_size=128 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 128 '!=' 128 ']' 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 1585294 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # '[' -z 1585294 ']' 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # kill -0 1585294 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # uname 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # '[' 
Linux = Linux ']' 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1585294 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1585294' 00:12:21.235 killing process with pid 1585294 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@969 -- # kill 1585294 00:12:21.235 [2024-07-25 07:17:53.749474] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:21.235 [2024-07-25 07:17:53.749529] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:21.235 [2024-07-25 07:17:53.749569] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:21.235 [2024-07-25 07:17:53.749580] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15c1c00 name Raid, state offline 00:12:21.235 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@974 -- # wait 1585294 00:12:21.235 [2024-07-25 07:17:53.750780] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:21.494 07:17:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:12:21.494 00:12:21.494 real 0m2.715s 00:12:21.494 user 0m4.185s 00:12:21.494 sys 0m0.555s 00:12:21.494 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:21.494 07:17:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.494 ************************************ 00:12:21.494 END TEST raid0_resize_test 00:12:21.494 ************************************ 00:12:21.494 07:17:53 bdev_raid -- bdev/bdev_raid.sh@943 -- # run_test raid1_resize_test raid_resize_test 1 00:12:21.494 07:17:53 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:21.494 07:17:53 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:21.494 07:17:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:21.494 ************************************ 00:12:21.494 START TEST raid1_resize_test 00:12:21.494 ************************************ 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 1 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=1 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:12:21.494 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=1585842 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 1585842' 00:12:21.495 Process raid pid: 1585842 00:12:21.495 
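The raid0 test then grows each base device and checks that the raid volume follows: resizing Base_1 alone leaves the raid at 131072 blocks (64 MiB), while resizing Base_2 as well doubles it to 262144 blocks (128 MiB), which is what the jq '.[].num_blocks' queries above confirm. The raid1_resize_test starting here repeats the same pattern with -r 1 and no strip size, where the mirrored size only grows once both legs have grown (65536 to 131072 blocks). A sketch of the resize-and-verify steps, reusing the rpc shorthand from the earlier note:

    # grow the first base bdev to 64 MiB and read back the raid block count
    $rpc bdev_null_resize Base_1 64
    $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # raid0: still 131072
    # grow the second base bdev; only now does the raid0 volume double
    $rpc bdev_null_resize Base_2 64
    $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # raid0: 262144
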
07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@359 -- # waitforlisten 1585842 /var/tmp/spdk-raid.sock 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@831 -- # '[' -z 1585842 ']' 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:21.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:21.495 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.754 [2024-07-25 07:17:54.071048] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:12:21.754 [2024-07-25 07:17:54.071108] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:21.754 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:21.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:21.754 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:21.754 [2024-07-25 07:17:54.202926] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:21.754 [2024-07-25 07:17:54.284245] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:22.014 [2024-07-25 07:17:54.342561] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:22.014 [2024-07-25 07:17:54.342586] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:22.582 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:22.582 07:17:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@864 -- # return 0 00:12:22.582 07:17:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:12:22.841 Base_1 00:12:22.841 07:17:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_null_create Base_2 32 512 00:12:23.100 Base_2 00:12:23.100 07:17:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 1 -eq 0 ']' 00:12:23.100 07:17:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@367 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 1 -b 'Base_1 Base_2' -n Raid 00:12:23.100 [2024-07-25 07:17:55.634702] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:23.359 [2024-07-25 07:17:55.636024] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:23.360 [2024-07-25 07:17:55.636079] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1448c00 00:12:23.360 [2024-07-25 07:17:55.636089] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:23.360 [2024-07-25 07:17:55.636269] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf8c090 00:12:23.360 [2024-07-25 07:17:55.636361] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1448c00 00:12:23.360 [2024-07-25 07:17:55.636370] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1448c00 00:12:23.360 [2024-07-25 07:17:55.636455] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:23.360 07:17:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:12:23.360 [2024-07-25 07:17:55.863287] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:23.360 [2024-07-25 07:17:55.863304] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:12:23.360 true 00:12:23.360 07:17:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:23.360 07:17:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:12:23.619 [2024-07-25 07:17:56.096045] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:23.619 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=65536 00:12:23.619 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=32 00:12:23.619 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 1 -eq 0 ']' 00:12:23.619 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@379 -- # expected_size=32 00:12:23.619 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 32 '!=' 32 ']' 00:12:23.619 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:12:23.878 [2024-07-25 07:17:56.328505] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:23.878 [2024-07-25 07:17:56.328527] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:12:23.878 [2024-07-25 07:17:56.328554] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 65536 to 131072 00:12:23.878 true 00:12:23.878 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:23.878 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:12:24.138 [2024-07-25 07:17:56.557256] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=131072 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=64 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 1 -eq 0 ']' 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@395 -- # expected_size=64 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 64 '!=' 64 ']' 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 1585842 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@950 -- # '[' -z 1585842 ']' 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@954 -- # kill -0 1585842 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # uname 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1585842 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1585842' 00:12:24.138 killing process with pid 1585842 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@969 -- # kill 1585842 00:12:24.138 [2024-07-25 07:17:56.635119] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:24.138 [2024-07-25 07:17:56.635180] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:24.138 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@974 -- # wait 1585842 00:12:24.138 [2024-07-25 07:17:56.635504] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:24.138 [2024-07-25 07:17:56.635517] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1448c00 name Raid, state offline 00:12:24.138 [2024-07-25 07:17:56.636435] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:24.397 07:17:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:12:24.397 00:12:24.397 real 0m2.803s 00:12:24.397 user 0m4.262s 00:12:24.398 sys 0m0.648s 00:12:24.398 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:24.398 07:17:56 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.398 ************************************ 00:12:24.398 END TEST raid1_resize_test 00:12:24.398 ************************************ 00:12:24.398 07:17:56 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:12:24.398 07:17:56 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:12:24.398 07:17:56 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:12:24.398 07:17:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:24.398 07:17:56 bdev_raid 
-- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:24.398 07:17:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:24.398 ************************************ 00:12:24.398 START TEST raid_state_function_test 00:12:24.398 ************************************ 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 false 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1586408 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1586408' 00:12:24.398 Process raid pid: 1586408 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1586408 /var/tmp/spdk-raid.sock 00:12:24.398 07:17:56 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1586408 ']' 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:24.398 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:24.398 07:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.658 [2024-07-25 07:17:56.958521] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:12:24.658 [2024-07-25 07:17:56.958576] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:24.658 qat_pci_device_allocate(): 
Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:24.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:24.658 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:24.658 [2024-07-25 07:17:57.092893] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.659 [2024-07-25 07:17:57.179025] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.918 [2024-07-25 07:17:57.239475] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.918 [2024-07-25 07:17:57.239509] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:25.486 07:17:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:25.486 07:17:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:12:25.486 07:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:25.745 [2024-07-25 07:17:58.063031] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:25.745 [2024-07-25 07:17:58.063067] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:25.745 [2024-07-25 07:17:58.063077] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:25.745 [2024-07-25 07:17:58.063088] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.745 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.004 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.004 "name": "Existed_Raid", 00:12:26.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.004 "strip_size_kb": 64, 00:12:26.004 "state": "configuring", 00:12:26.004 "raid_level": "raid0", 00:12:26.004 "superblock": false, 00:12:26.004 "num_base_bdevs": 2, 00:12:26.004 "num_base_bdevs_discovered": 0, 00:12:26.004 "num_base_bdevs_operational": 2, 00:12:26.005 "base_bdevs_list": [ 00:12:26.005 { 00:12:26.005 "name": "BaseBdev1", 00:12:26.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.005 "is_configured": false, 00:12:26.005 "data_offset": 0, 00:12:26.005 "data_size": 0 00:12:26.005 }, 00:12:26.005 { 00:12:26.005 "name": "BaseBdev2", 00:12:26.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.005 "is_configured": false, 00:12:26.005 "data_offset": 0, 00:12:26.005 "data_size": 0 00:12:26.005 } 00:12:26.005 ] 00:12:26.005 }' 00:12:26.005 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.005 07:17:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.573 07:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:26.573 [2024-07-25 07:17:59.081608] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:26.573 [2024-07-25 07:17:59.081635] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2aea0 name Existed_Raid, state configuring 00:12:26.573 07:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:26.832 [2024-07-25 07:17:59.314221] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:26.832 [2024-07-25 07:17:59.314244] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:26.832 [2024-07-25 07:17:59.314253] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:26.832 [2024-07-25 07:17:59.314263] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:26.832 07:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:27.091 [2024-07-25 07:17:59.552179] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:27.091 BaseBdev1 00:12:27.091 07:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:27.091 07:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:27.091 07:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:27.091 07:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:27.091 07:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:27.091 07:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:27.091 07:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:27.350 07:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:27.609 [ 00:12:27.609 { 00:12:27.609 "name": "BaseBdev1", 00:12:27.609 "aliases": [ 00:12:27.609 "97b9625a-c578-406f-8971-e6365ff8dc45" 00:12:27.609 ], 00:12:27.609 "product_name": "Malloc disk", 00:12:27.609 "block_size": 512, 00:12:27.609 "num_blocks": 65536, 00:12:27.609 "uuid": "97b9625a-c578-406f-8971-e6365ff8dc45", 00:12:27.609 "assigned_rate_limits": { 00:12:27.609 "rw_ios_per_sec": 0, 00:12:27.609 "rw_mbytes_per_sec": 0, 00:12:27.609 "r_mbytes_per_sec": 0, 00:12:27.609 "w_mbytes_per_sec": 0 00:12:27.609 }, 00:12:27.609 "claimed": true, 00:12:27.609 "claim_type": "exclusive_write", 00:12:27.609 "zoned": false, 00:12:27.609 "supported_io_types": { 00:12:27.609 "read": true, 00:12:27.609 "write": true, 00:12:27.609 "unmap": true, 00:12:27.609 "flush": true, 00:12:27.609 "reset": true, 00:12:27.609 "nvme_admin": false, 00:12:27.609 "nvme_io": false, 00:12:27.609 "nvme_io_md": false, 00:12:27.609 "write_zeroes": true, 00:12:27.609 "zcopy": true, 00:12:27.609 "get_zone_info": false, 00:12:27.609 "zone_management": false, 00:12:27.609 "zone_append": false, 00:12:27.609 "compare": false, 00:12:27.609 "compare_and_write": false, 00:12:27.609 "abort": true, 00:12:27.609 "seek_hole": false, 00:12:27.609 "seek_data": false, 00:12:27.609 "copy": true, 00:12:27.609 "nvme_iov_md": false 00:12:27.609 }, 00:12:27.609 "memory_domains": [ 00:12:27.609 { 00:12:27.609 "dma_device_id": "system", 00:12:27.609 "dma_device_type": 1 00:12:27.609 }, 00:12:27.609 { 00:12:27.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.609 "dma_device_type": 2 00:12:27.609 } 00:12:27.609 ], 00:12:27.609 
"driver_specific": {} 00:12:27.609 } 00:12:27.609 ] 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.609 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:27.868 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.868 "name": "Existed_Raid", 00:12:27.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.868 "strip_size_kb": 64, 00:12:27.868 "state": "configuring", 00:12:27.868 "raid_level": "raid0", 00:12:27.868 "superblock": false, 00:12:27.868 "num_base_bdevs": 2, 00:12:27.868 "num_base_bdevs_discovered": 1, 00:12:27.868 "num_base_bdevs_operational": 2, 00:12:27.868 "base_bdevs_list": [ 00:12:27.868 { 00:12:27.868 "name": "BaseBdev1", 00:12:27.868 "uuid": "97b9625a-c578-406f-8971-e6365ff8dc45", 00:12:27.868 "is_configured": true, 00:12:27.868 "data_offset": 0, 00:12:27.868 "data_size": 65536 00:12:27.869 }, 00:12:27.869 { 00:12:27.869 "name": "BaseBdev2", 00:12:27.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.869 "is_configured": false, 00:12:27.869 "data_offset": 0, 00:12:27.869 "data_size": 0 00:12:27.869 } 00:12:27.869 ] 00:12:27.869 }' 00:12:27.869 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.869 07:18:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.437 07:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:28.696 [2024-07-25 07:18:01.036078] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:28.696 [2024-07-25 07:18:01.036116] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2a790 name Existed_Raid, state configuring 00:12:28.696 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:12:28.955 [2024-07-25 07:18:01.260894] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:28.955 [2024-07-25 07:18:01.262284] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:28.955 [2024-07-25 07:18:01.262315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.955 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.214 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.214 "name": "Existed_Raid", 00:12:29.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.214 "strip_size_kb": 64, 00:12:29.214 "state": "configuring", 00:12:29.214 "raid_level": "raid0", 00:12:29.214 "superblock": false, 00:12:29.214 "num_base_bdevs": 2, 00:12:29.214 "num_base_bdevs_discovered": 1, 00:12:29.214 "num_base_bdevs_operational": 2, 00:12:29.214 "base_bdevs_list": [ 00:12:29.214 { 00:12:29.214 "name": "BaseBdev1", 00:12:29.214 "uuid": "97b9625a-c578-406f-8971-e6365ff8dc45", 00:12:29.214 "is_configured": true, 00:12:29.214 "data_offset": 0, 00:12:29.214 "data_size": 65536 00:12:29.214 }, 00:12:29.214 { 00:12:29.214 "name": "BaseBdev2", 00:12:29.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.214 "is_configured": false, 00:12:29.214 "data_offset": 0, 00:12:29.214 "data_size": 0 00:12:29.214 } 00:12:29.214 ] 00:12:29.214 }' 00:12:29.214 07:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.214 07:18:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.782 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:29.782 [2024-07-25 07:18:02.306803] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:29.782 [2024-07-25 07:18:02.306834] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f2b580 00:12:29.782 [2024-07-25 07:18:02.306842] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:29.782 [2024-07-25 07:18:02.307017] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f21d00 00:12:29.782 [2024-07-25 07:18:02.307129] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f2b580 00:12:29.782 [2024-07-25 07:18:02.307148] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f2b580 00:12:29.782 [2024-07-25 07:18:02.307300] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:29.782 BaseBdev2 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:30.042 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:30.301 [ 00:12:30.301 { 00:12:30.301 "name": "BaseBdev2", 00:12:30.301 "aliases": [ 00:12:30.301 "e68c24c5-c5cf-49a0-af9c-befc101bc9fc" 00:12:30.301 ], 00:12:30.301 "product_name": "Malloc disk", 00:12:30.301 "block_size": 512, 00:12:30.301 "num_blocks": 65536, 00:12:30.301 "uuid": "e68c24c5-c5cf-49a0-af9c-befc101bc9fc", 00:12:30.301 "assigned_rate_limits": { 00:12:30.301 "rw_ios_per_sec": 0, 00:12:30.301 "rw_mbytes_per_sec": 0, 00:12:30.301 "r_mbytes_per_sec": 0, 00:12:30.301 "w_mbytes_per_sec": 0 00:12:30.301 }, 00:12:30.301 "claimed": true, 00:12:30.301 "claim_type": "exclusive_write", 00:12:30.301 "zoned": false, 00:12:30.301 "supported_io_types": { 00:12:30.301 "read": true, 00:12:30.301 "write": true, 00:12:30.301 "unmap": true, 00:12:30.301 "flush": true, 00:12:30.301 "reset": true, 00:12:30.301 "nvme_admin": false, 00:12:30.301 "nvme_io": false, 00:12:30.301 "nvme_io_md": false, 00:12:30.301 "write_zeroes": true, 00:12:30.301 "zcopy": true, 00:12:30.301 "get_zone_info": false, 00:12:30.301 "zone_management": false, 00:12:30.301 "zone_append": false, 00:12:30.301 "compare": false, 00:12:30.301 "compare_and_write": false, 00:12:30.301 "abort": true, 00:12:30.301 "seek_hole": false, 00:12:30.301 "seek_data": false, 00:12:30.301 "copy": true, 00:12:30.301 "nvme_iov_md": false 00:12:30.301 }, 00:12:30.301 "memory_domains": [ 00:12:30.301 { 00:12:30.301 "dma_device_id": "system", 00:12:30.301 "dma_device_type": 1 00:12:30.301 }, 00:12:30.301 { 00:12:30.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.301 "dma_device_type": 2 00:12:30.301 } 00:12:30.301 
], 00:12:30.301 "driver_specific": {} 00:12:30.301 } 00:12:30.301 ] 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.301 07:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.560 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.560 "name": "Existed_Raid", 00:12:30.560 "uuid": "97fe8605-715e-4eb0-b3fb-012b2690c5f4", 00:12:30.560 "strip_size_kb": 64, 00:12:30.560 "state": "online", 00:12:30.560 "raid_level": "raid0", 00:12:30.560 "superblock": false, 00:12:30.560 "num_base_bdevs": 2, 00:12:30.560 "num_base_bdevs_discovered": 2, 00:12:30.560 "num_base_bdevs_operational": 2, 00:12:30.560 "base_bdevs_list": [ 00:12:30.560 { 00:12:30.560 "name": "BaseBdev1", 00:12:30.560 "uuid": "97b9625a-c578-406f-8971-e6365ff8dc45", 00:12:30.560 "is_configured": true, 00:12:30.560 "data_offset": 0, 00:12:30.560 "data_size": 65536 00:12:30.560 }, 00:12:30.560 { 00:12:30.560 "name": "BaseBdev2", 00:12:30.560 "uuid": "e68c24c5-c5cf-49a0-af9c-befc101bc9fc", 00:12:30.560 "is_configured": true, 00:12:30.560 "data_offset": 0, 00:12:30.560 "data_size": 65536 00:12:30.560 } 00:12:30.560 ] 00:12:30.560 }' 00:12:30.560 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.560 07:18:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.128 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:31.128 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:31.128 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:31.128 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:31.128 07:18:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:31.128 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:31.128 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:31.128 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:31.387 [2024-07-25 07:18:03.778929] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:31.387 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:31.387 "name": "Existed_Raid", 00:12:31.387 "aliases": [ 00:12:31.387 "97fe8605-715e-4eb0-b3fb-012b2690c5f4" 00:12:31.387 ], 00:12:31.387 "product_name": "Raid Volume", 00:12:31.387 "block_size": 512, 00:12:31.387 "num_blocks": 131072, 00:12:31.387 "uuid": "97fe8605-715e-4eb0-b3fb-012b2690c5f4", 00:12:31.387 "assigned_rate_limits": { 00:12:31.387 "rw_ios_per_sec": 0, 00:12:31.387 "rw_mbytes_per_sec": 0, 00:12:31.387 "r_mbytes_per_sec": 0, 00:12:31.387 "w_mbytes_per_sec": 0 00:12:31.387 }, 00:12:31.387 "claimed": false, 00:12:31.387 "zoned": false, 00:12:31.387 "supported_io_types": { 00:12:31.387 "read": true, 00:12:31.387 "write": true, 00:12:31.387 "unmap": true, 00:12:31.387 "flush": true, 00:12:31.387 "reset": true, 00:12:31.387 "nvme_admin": false, 00:12:31.387 "nvme_io": false, 00:12:31.387 "nvme_io_md": false, 00:12:31.387 "write_zeroes": true, 00:12:31.387 "zcopy": false, 00:12:31.387 "get_zone_info": false, 00:12:31.387 "zone_management": false, 00:12:31.387 "zone_append": false, 00:12:31.387 "compare": false, 00:12:31.387 "compare_and_write": false, 00:12:31.387 "abort": false, 00:12:31.387 "seek_hole": false, 00:12:31.387 "seek_data": false, 00:12:31.387 "copy": false, 00:12:31.387 "nvme_iov_md": false 00:12:31.387 }, 00:12:31.387 "memory_domains": [ 00:12:31.387 { 00:12:31.387 "dma_device_id": "system", 00:12:31.387 "dma_device_type": 1 00:12:31.387 }, 00:12:31.387 { 00:12:31.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.387 "dma_device_type": 2 00:12:31.387 }, 00:12:31.387 { 00:12:31.387 "dma_device_id": "system", 00:12:31.387 "dma_device_type": 1 00:12:31.387 }, 00:12:31.387 { 00:12:31.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.387 "dma_device_type": 2 00:12:31.387 } 00:12:31.387 ], 00:12:31.387 "driver_specific": { 00:12:31.387 "raid": { 00:12:31.387 "uuid": "97fe8605-715e-4eb0-b3fb-012b2690c5f4", 00:12:31.387 "strip_size_kb": 64, 00:12:31.387 "state": "online", 00:12:31.387 "raid_level": "raid0", 00:12:31.387 "superblock": false, 00:12:31.387 "num_base_bdevs": 2, 00:12:31.387 "num_base_bdevs_discovered": 2, 00:12:31.387 "num_base_bdevs_operational": 2, 00:12:31.387 "base_bdevs_list": [ 00:12:31.387 { 00:12:31.387 "name": "BaseBdev1", 00:12:31.387 "uuid": "97b9625a-c578-406f-8971-e6365ff8dc45", 00:12:31.387 "is_configured": true, 00:12:31.387 "data_offset": 0, 00:12:31.387 "data_size": 65536 00:12:31.387 }, 00:12:31.387 { 00:12:31.387 "name": "BaseBdev2", 00:12:31.387 "uuid": "e68c24c5-c5cf-49a0-af9c-befc101bc9fc", 00:12:31.387 "is_configured": true, 00:12:31.387 "data_offset": 0, 00:12:31.387 "data_size": 65536 00:12:31.387 } 00:12:31.387 ] 00:12:31.387 } 00:12:31.387 } 00:12:31.387 }' 00:12:31.387 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:12:31.387 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:31.387 BaseBdev2' 00:12:31.387 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.387 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:31.387 07:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:31.647 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:31.647 "name": "BaseBdev1", 00:12:31.647 "aliases": [ 00:12:31.647 "97b9625a-c578-406f-8971-e6365ff8dc45" 00:12:31.647 ], 00:12:31.647 "product_name": "Malloc disk", 00:12:31.647 "block_size": 512, 00:12:31.647 "num_blocks": 65536, 00:12:31.647 "uuid": "97b9625a-c578-406f-8971-e6365ff8dc45", 00:12:31.647 "assigned_rate_limits": { 00:12:31.647 "rw_ios_per_sec": 0, 00:12:31.647 "rw_mbytes_per_sec": 0, 00:12:31.647 "r_mbytes_per_sec": 0, 00:12:31.647 "w_mbytes_per_sec": 0 00:12:31.647 }, 00:12:31.647 "claimed": true, 00:12:31.647 "claim_type": "exclusive_write", 00:12:31.647 "zoned": false, 00:12:31.647 "supported_io_types": { 00:12:31.647 "read": true, 00:12:31.647 "write": true, 00:12:31.647 "unmap": true, 00:12:31.647 "flush": true, 00:12:31.647 "reset": true, 00:12:31.647 "nvme_admin": false, 00:12:31.647 "nvme_io": false, 00:12:31.647 "nvme_io_md": false, 00:12:31.647 "write_zeroes": true, 00:12:31.647 "zcopy": true, 00:12:31.647 "get_zone_info": false, 00:12:31.647 "zone_management": false, 00:12:31.647 "zone_append": false, 00:12:31.647 "compare": false, 00:12:31.647 "compare_and_write": false, 00:12:31.647 "abort": true, 00:12:31.647 "seek_hole": false, 00:12:31.647 "seek_data": false, 00:12:31.647 "copy": true, 00:12:31.647 "nvme_iov_md": false 00:12:31.647 }, 00:12:31.647 "memory_domains": [ 00:12:31.647 { 00:12:31.647 "dma_device_id": "system", 00:12:31.647 "dma_device_type": 1 00:12:31.647 }, 00:12:31.647 { 00:12:31.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.647 "dma_device_type": 2 00:12:31.647 } 00:12:31.647 ], 00:12:31.647 "driver_specific": {} 00:12:31.647 }' 00:12:31.647 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.647 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.647 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:31.647 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:31.906 07:18:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:31.906 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.165 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.165 "name": "BaseBdev2", 00:12:32.165 "aliases": [ 00:12:32.165 "e68c24c5-c5cf-49a0-af9c-befc101bc9fc" 00:12:32.165 ], 00:12:32.165 "product_name": "Malloc disk", 00:12:32.165 "block_size": 512, 00:12:32.165 "num_blocks": 65536, 00:12:32.165 "uuid": "e68c24c5-c5cf-49a0-af9c-befc101bc9fc", 00:12:32.165 "assigned_rate_limits": { 00:12:32.165 "rw_ios_per_sec": 0, 00:12:32.165 "rw_mbytes_per_sec": 0, 00:12:32.165 "r_mbytes_per_sec": 0, 00:12:32.165 "w_mbytes_per_sec": 0 00:12:32.165 }, 00:12:32.165 "claimed": true, 00:12:32.165 "claim_type": "exclusive_write", 00:12:32.165 "zoned": false, 00:12:32.165 "supported_io_types": { 00:12:32.165 "read": true, 00:12:32.165 "write": true, 00:12:32.165 "unmap": true, 00:12:32.165 "flush": true, 00:12:32.165 "reset": true, 00:12:32.165 "nvme_admin": false, 00:12:32.165 "nvme_io": false, 00:12:32.165 "nvme_io_md": false, 00:12:32.165 "write_zeroes": true, 00:12:32.165 "zcopy": true, 00:12:32.165 "get_zone_info": false, 00:12:32.165 "zone_management": false, 00:12:32.165 "zone_append": false, 00:12:32.165 "compare": false, 00:12:32.165 "compare_and_write": false, 00:12:32.165 "abort": true, 00:12:32.165 "seek_hole": false, 00:12:32.165 "seek_data": false, 00:12:32.165 "copy": true, 00:12:32.165 "nvme_iov_md": false 00:12:32.165 }, 00:12:32.165 "memory_domains": [ 00:12:32.165 { 00:12:32.165 "dma_device_id": "system", 00:12:32.165 "dma_device_type": 1 00:12:32.165 }, 00:12:32.165 { 00:12:32.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.165 "dma_device_type": 2 00:12:32.165 } 00:12:32.165 ], 00:12:32.165 "driver_specific": {} 00:12:32.165 }' 00:12:32.165 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.165 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.424 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.425 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.684 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.684 07:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete 
BaseBdev1 00:12:32.684 [2024-07-25 07:18:05.194466] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:32.684 [2024-07-25 07:18:05.194491] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:32.684 [2024-07-25 07:18:05.194529] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:32.684 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:32.684 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:32.684 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:32.684 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:32.684 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:32.684 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:12:32.684 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.943 "name": "Existed_Raid", 00:12:32.943 "uuid": "97fe8605-715e-4eb0-b3fb-012b2690c5f4", 00:12:32.943 "strip_size_kb": 64, 00:12:32.943 "state": "offline", 00:12:32.943 "raid_level": "raid0", 00:12:32.943 "superblock": false, 00:12:32.943 "num_base_bdevs": 2, 00:12:32.943 "num_base_bdevs_discovered": 1, 00:12:32.943 "num_base_bdevs_operational": 1, 00:12:32.943 "base_bdevs_list": [ 00:12:32.943 { 00:12:32.943 "name": null, 00:12:32.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.943 "is_configured": false, 00:12:32.943 "data_offset": 0, 00:12:32.943 "data_size": 65536 00:12:32.943 }, 00:12:32.943 { 00:12:32.943 "name": "BaseBdev2", 00:12:32.943 "uuid": "e68c24c5-c5cf-49a0-af9c-befc101bc9fc", 00:12:32.943 "is_configured": true, 00:12:32.943 "data_offset": 0, 00:12:32.943 "data_size": 65536 00:12:32.943 } 00:12:32.943 ] 00:12:32.943 }' 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.943 07:18:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 
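
For reference, the state verification traced above (verify_raid_bdev_state) reduces to one RPC call plus a jq filter, both of which appear verbatim in the trace. A minimal stand-alone sketch, assuming the same RPC socket and raid name used in this run:

    # query all raid bdevs over the test's RPC socket and pull out Existed_Raid's state
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state'
    # prints "offline" at this point: raid0 has no redundancy, so removing BaseBdev1 deconfigures the array
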
00:12:33.510 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:33.510 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:33.510 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.510 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:33.768 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:33.768 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:33.768 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:34.027 [2024-07-25 07:18:06.462839] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:34.027 [2024-07-25 07:18:06.462887] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2b580 name Existed_Raid, state offline 00:12:34.027 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:34.027 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:34.027 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:34.027 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1586408 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1586408 ']' 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1586408 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:12:34.286 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:34.287 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1586408 00:12:34.287 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:34.287 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:34.287 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1586408' 00:12:34.287 killing process with pid 1586408 00:12:34.287 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1586408 00:12:34.287 [2024-07-25 07:18:06.777347] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:34.287 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1586408 00:12:34.287 [2024-07-25 07:18:06.778219] bdev_raid.c:1399:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:12:34.546 07:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:34.546 00:12:34.546 real 0m10.082s 00:12:34.546 user 0m17.863s 00:12:34.546 sys 0m1.913s 00:12:34.546 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:34.546 07:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.546 ************************************ 00:12:34.546 END TEST raid_state_function_test 00:12:34.546 ************************************ 00:12:34.546 07:18:07 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:12:34.546 07:18:07 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:34.546 07:18:07 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:34.546 07:18:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:34.546 ************************************ 00:12:34.546 START TEST raid_state_function_test_sb 00:12:34.546 ************************************ 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 true 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1588904 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1588904' 00:12:34.546 Process raid pid: 1588904 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1588904 /var/tmp/spdk-raid.sock 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1588904 ']' 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:34.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:34.546 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:34.875 [2024-07-25 07:18:07.110084] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
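
The test stands up a dedicated bdev_svc app and only proceeds once its RPC socket answers (waitforlisten). A rough stand-alone approximation of that startup gate, assuming an spdk checkout as the working directory and polling the core rpc_get_methods RPC (the harness's waitforlisten does more than this sketch):

    # start the bdev service with raid debug logging on a private RPC socket
    ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    # poll until the RPC socket answers, bailing out if the process dies first
    until ./scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        kill -0 "$raid_pid" || exit 1
        sleep 0.1
    done
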
00:12:34.875 [2024-07-25 07:18:07.110150] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:34.875 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:34.875 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:34.875 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:34.876 [2024-07-25 07:18:07.241965] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:34.876 [2024-07-25 07:18:07.328054] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:35.134 [2024-07-25 07:18:07.382699] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.134 [2024-07-25 07:18:07.382724] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.701 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:35.701 07:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:12:35.701 07:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:35.701 [2024-07-25 07:18:08.140195] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:35.702 [2024-07-25 07:18:08.140230] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:35.702 [2024-07-25 07:18:08.140240] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:35.702 [2024-07-25 07:18:08.140251] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
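
As the trace shows, bdev_raid_create is issued with '-z 64 -s' (64 KiB strip size, superblock enabled) before either base bdev exists, and the resulting Existed_Raid sits in the "configuring" state until the bases appear. The traced commands, condensed into a sketch with the same socket, names, and arguments:

    # create a superblock raid0 over two not-yet-existing base bdevs; it stays "configuring"
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 \
        -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # -> configuring
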
00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.702 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.269 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.269 "name": "Existed_Raid", 00:12:36.269 "uuid": "0fc2f183-efa0-4bb9-8c87-17a579724d78", 00:12:36.269 "strip_size_kb": 64, 00:12:36.269 "state": "configuring", 00:12:36.269 "raid_level": "raid0", 00:12:36.269 "superblock": true, 00:12:36.269 "num_base_bdevs": 2, 00:12:36.269 "num_base_bdevs_discovered": 0, 00:12:36.269 "num_base_bdevs_operational": 2, 00:12:36.269 "base_bdevs_list": [ 00:12:36.269 { 00:12:36.269 "name": "BaseBdev1", 00:12:36.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.269 "is_configured": false, 00:12:36.269 "data_offset": 0, 00:12:36.269 "data_size": 0 00:12:36.269 }, 00:12:36.269 { 00:12:36.269 "name": "BaseBdev2", 00:12:36.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.269 "is_configured": false, 00:12:36.269 "data_offset": 0, 00:12:36.269 "data_size": 0 00:12:36.269 } 00:12:36.269 ] 00:12:36.269 }' 00:12:36.269 07:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.269 07:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:36.836 07:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:36.836 [2024-07-25 07:18:09.367289] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:36.836 [2024-07-25 07:18:09.367313] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1522ea0 name Existed_Raid, state configuring 00:12:37.095 07:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:37.095 [2024-07-25 07:18:09.595909] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:37.095 [2024-07-25 07:18:09.595932] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:37.095 [2024-07-25 07:18:09.595941] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:37.095 [2024-07-25 07:18:09.595952] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:37.095 07:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:37.354 [2024-07-25 07:18:09.838021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:37.354 BaseBdev1 00:12:37.354 07:18:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:37.354 07:18:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:37.354 07:18:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:37.354 07:18:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:37.354 07:18:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:37.354 07:18:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:37.354 07:18:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.613 07:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:37.872 [ 00:12:37.872 { 00:12:37.872 "name": "BaseBdev1", 00:12:37.872 "aliases": [ 00:12:37.872 "c8a2c754-4c6c-4e38-9496-a96ac00f4843" 00:12:37.872 ], 00:12:37.872 "product_name": "Malloc disk", 00:12:37.872 "block_size": 512, 00:12:37.872 "num_blocks": 65536, 00:12:37.872 "uuid": "c8a2c754-4c6c-4e38-9496-a96ac00f4843", 00:12:37.872 "assigned_rate_limits": { 00:12:37.872 "rw_ios_per_sec": 0, 00:12:37.872 "rw_mbytes_per_sec": 0, 00:12:37.872 "r_mbytes_per_sec": 0, 00:12:37.872 "w_mbytes_per_sec": 0 00:12:37.872 }, 00:12:37.872 "claimed": true, 00:12:37.872 "claim_type": "exclusive_write", 00:12:37.872 "zoned": false, 00:12:37.872 "supported_io_types": { 00:12:37.872 "read": true, 00:12:37.872 "write": true, 00:12:37.872 "unmap": true, 00:12:37.872 "flush": true, 00:12:37.872 "reset": true, 00:12:37.872 "nvme_admin": false, 00:12:37.872 "nvme_io": false, 00:12:37.872 "nvme_io_md": false, 00:12:37.872 "write_zeroes": true, 00:12:37.872 "zcopy": true, 00:12:37.872 "get_zone_info": false, 00:12:37.872 "zone_management": false, 00:12:37.872 "zone_append": false, 00:12:37.872 "compare": false, 00:12:37.872 "compare_and_write": false, 00:12:37.872 "abort": true, 00:12:37.872 "seek_hole": false, 00:12:37.872 "seek_data": false, 00:12:37.872 "copy": true, 00:12:37.872 "nvme_iov_md": false 00:12:37.872 }, 00:12:37.872 "memory_domains": [ 00:12:37.872 { 00:12:37.872 "dma_device_id": "system", 00:12:37.872 "dma_device_type": 1 00:12:37.872 }, 00:12:37.872 { 00:12:37.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.872 "dma_device_type": 2 00:12:37.872 } 00:12:37.872 ], 00:12:37.872 "driver_specific": {} 00:12:37.872 } 00:12:37.872 ] 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.872 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.131 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.131 "name": "Existed_Raid", 00:12:38.132 "uuid": "8c105f92-38af-46fc-a050-c7d4871c53cd", 00:12:38.132 "strip_size_kb": 64, 00:12:38.132 "state": "configuring", 00:12:38.132 "raid_level": "raid0", 00:12:38.132 "superblock": true, 00:12:38.132 "num_base_bdevs": 2, 00:12:38.132 "num_base_bdevs_discovered": 1, 00:12:38.132 "num_base_bdevs_operational": 2, 00:12:38.132 "base_bdevs_list": [ 00:12:38.132 { 00:12:38.132 "name": "BaseBdev1", 00:12:38.132 "uuid": "c8a2c754-4c6c-4e38-9496-a96ac00f4843", 00:12:38.132 "is_configured": true, 00:12:38.132 "data_offset": 2048, 00:12:38.132 "data_size": 63488 00:12:38.132 }, 00:12:38.132 { 00:12:38.132 "name": "BaseBdev2", 00:12:38.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.132 "is_configured": false, 00:12:38.132 "data_offset": 0, 00:12:38.132 "data_size": 0 00:12:38.132 } 00:12:38.132 ] 00:12:38.132 }' 00:12:38.132 07:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.132 07:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:38.700 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:38.700 [2024-07-25 07:18:11.213691] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:38.700 [2024-07-25 07:18:11.213723] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1522790 name Existed_Raid, state configuring 00:12:38.700 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:38.959 [2024-07-25 07:18:11.434304] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:38.959 [2024-07-25 07:18:11.435681] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:38.959 [2024-07-25 07:18:11.435710] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:38.959 07:18:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.959 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.218 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.218 "name": "Existed_Raid", 00:12:39.218 "uuid": "3f02a5be-3126-4b39-bfe2-5bde08137249", 00:12:39.218 "strip_size_kb": 64, 00:12:39.218 "state": "configuring", 00:12:39.218 "raid_level": "raid0", 00:12:39.218 "superblock": true, 00:12:39.218 "num_base_bdevs": 2, 00:12:39.218 "num_base_bdevs_discovered": 1, 00:12:39.218 "num_base_bdevs_operational": 2, 00:12:39.218 "base_bdevs_list": [ 00:12:39.218 { 00:12:39.218 "name": "BaseBdev1", 00:12:39.218 "uuid": "c8a2c754-4c6c-4e38-9496-a96ac00f4843", 00:12:39.218 "is_configured": true, 00:12:39.218 "data_offset": 2048, 00:12:39.218 "data_size": 63488 00:12:39.218 }, 00:12:39.218 { 00:12:39.218 "name": "BaseBdev2", 00:12:39.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.218 "is_configured": false, 00:12:39.218 "data_offset": 0, 00:12:39.218 "data_size": 0 00:12:39.218 } 00:12:39.218 ] 00:12:39.218 }' 00:12:39.218 07:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.218 07:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:39.786 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:40.045 [2024-07-25 07:18:12.456026] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:40.045 [2024-07-25 07:18:12.456165] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1523580 00:12:40.045 [2024-07-25 07:18:12.456178] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:40.045 [2024-07-25 07:18:12.456336] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15249a0 00:12:40.045 [2024-07-25 07:18:12.456457] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1523580 00:12:40.045 [2024-07-25 07:18:12.456466] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1523580 
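
Each base is a 32 MiB malloc bdev with 512-byte blocks (65536 blocks, matching the bdev dumps above); once BaseBdev2 is registered the raid moves from "configuring" to "online", as the DEBUG lines here show. The equivalent RPC calls, as traced:

    # register the two base bdevs; the second one completes the raid0 and brings Existed_Raid online
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
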
00:12:40.045 [2024-07-25 07:18:12.456550] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:40.045 BaseBdev2 00:12:40.045 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:40.045 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:40.045 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:40.045 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:40.045 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:40.045 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:40.045 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:40.304 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:40.563 [ 00:12:40.563 { 00:12:40.563 "name": "BaseBdev2", 00:12:40.563 "aliases": [ 00:12:40.563 "c3950c47-c078-4a23-8386-abcd0dbf5cc7" 00:12:40.563 ], 00:12:40.563 "product_name": "Malloc disk", 00:12:40.563 "block_size": 512, 00:12:40.563 "num_blocks": 65536, 00:12:40.563 "uuid": "c3950c47-c078-4a23-8386-abcd0dbf5cc7", 00:12:40.563 "assigned_rate_limits": { 00:12:40.563 "rw_ios_per_sec": 0, 00:12:40.563 "rw_mbytes_per_sec": 0, 00:12:40.563 "r_mbytes_per_sec": 0, 00:12:40.563 "w_mbytes_per_sec": 0 00:12:40.563 }, 00:12:40.563 "claimed": true, 00:12:40.563 "claim_type": "exclusive_write", 00:12:40.563 "zoned": false, 00:12:40.563 "supported_io_types": { 00:12:40.563 "read": true, 00:12:40.563 "write": true, 00:12:40.563 "unmap": true, 00:12:40.563 "flush": true, 00:12:40.563 "reset": true, 00:12:40.563 "nvme_admin": false, 00:12:40.563 "nvme_io": false, 00:12:40.563 "nvme_io_md": false, 00:12:40.563 "write_zeroes": true, 00:12:40.563 "zcopy": true, 00:12:40.563 "get_zone_info": false, 00:12:40.563 "zone_management": false, 00:12:40.563 "zone_append": false, 00:12:40.563 "compare": false, 00:12:40.563 "compare_and_write": false, 00:12:40.563 "abort": true, 00:12:40.563 "seek_hole": false, 00:12:40.563 "seek_data": false, 00:12:40.563 "copy": true, 00:12:40.563 "nvme_iov_md": false 00:12:40.563 }, 00:12:40.563 "memory_domains": [ 00:12:40.563 { 00:12:40.563 "dma_device_id": "system", 00:12:40.563 "dma_device_type": 1 00:12:40.563 }, 00:12:40.563 { 00:12:40.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.563 "dma_device_type": 2 00:12:40.563 } 00:12:40.563 ], 00:12:40.563 "driver_specific": {} 00:12:40.563 } 00:12:40.563 ] 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:40.563 07:18:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.563 07:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:40.823 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.823 "name": "Existed_Raid", 00:12:40.823 "uuid": "3f02a5be-3126-4b39-bfe2-5bde08137249", 00:12:40.823 "strip_size_kb": 64, 00:12:40.823 "state": "online", 00:12:40.823 "raid_level": "raid0", 00:12:40.823 "superblock": true, 00:12:40.823 "num_base_bdevs": 2, 00:12:40.823 "num_base_bdevs_discovered": 2, 00:12:40.823 "num_base_bdevs_operational": 2, 00:12:40.823 "base_bdevs_list": [ 00:12:40.823 { 00:12:40.823 "name": "BaseBdev1", 00:12:40.823 "uuid": "c8a2c754-4c6c-4e38-9496-a96ac00f4843", 00:12:40.823 "is_configured": true, 00:12:40.823 "data_offset": 2048, 00:12:40.823 "data_size": 63488 00:12:40.823 }, 00:12:40.823 { 00:12:40.823 "name": "BaseBdev2", 00:12:40.823 "uuid": "c3950c47-c078-4a23-8386-abcd0dbf5cc7", 00:12:40.823 "is_configured": true, 00:12:40.823 "data_offset": 2048, 00:12:40.823 "data_size": 63488 00:12:40.823 } 00:12:40.823 ] 00:12:40.823 }' 00:12:40.823 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.823 07:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:41.390 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:41.649 [2024-07-25 07:18:13.932154] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 
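
verify_raid_bdev_properties then dumps the assembled volume and cross-checks it against each base bdev (block_size, md_size, md_interleave, dif_type), using the jq filters shown in the trace that follows. Condensed into two commands over the same socket:

    # dump the raid volume itself
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid | jq '.[]'
    # list the configured base bdevs of the volume (prints BaseBdev1 and BaseBdev2 here, one per line)
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid \
        | jq -r '.[] | .driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
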
00:12:41.649 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:41.649 "name": "Existed_Raid", 00:12:41.649 "aliases": [ 00:12:41.649 "3f02a5be-3126-4b39-bfe2-5bde08137249" 00:12:41.649 ], 00:12:41.649 "product_name": "Raid Volume", 00:12:41.649 "block_size": 512, 00:12:41.649 "num_blocks": 126976, 00:12:41.649 "uuid": "3f02a5be-3126-4b39-bfe2-5bde08137249", 00:12:41.649 "assigned_rate_limits": { 00:12:41.649 "rw_ios_per_sec": 0, 00:12:41.649 "rw_mbytes_per_sec": 0, 00:12:41.649 "r_mbytes_per_sec": 0, 00:12:41.649 "w_mbytes_per_sec": 0 00:12:41.649 }, 00:12:41.649 "claimed": false, 00:12:41.649 "zoned": false, 00:12:41.649 "supported_io_types": { 00:12:41.649 "read": true, 00:12:41.649 "write": true, 00:12:41.649 "unmap": true, 00:12:41.649 "flush": true, 00:12:41.649 "reset": true, 00:12:41.649 "nvme_admin": false, 00:12:41.649 "nvme_io": false, 00:12:41.649 "nvme_io_md": false, 00:12:41.649 "write_zeroes": true, 00:12:41.649 "zcopy": false, 00:12:41.649 "get_zone_info": false, 00:12:41.649 "zone_management": false, 00:12:41.649 "zone_append": false, 00:12:41.649 "compare": false, 00:12:41.649 "compare_and_write": false, 00:12:41.649 "abort": false, 00:12:41.649 "seek_hole": false, 00:12:41.649 "seek_data": false, 00:12:41.649 "copy": false, 00:12:41.649 "nvme_iov_md": false 00:12:41.650 }, 00:12:41.650 "memory_domains": [ 00:12:41.650 { 00:12:41.650 "dma_device_id": "system", 00:12:41.650 "dma_device_type": 1 00:12:41.650 }, 00:12:41.650 { 00:12:41.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.650 "dma_device_type": 2 00:12:41.650 }, 00:12:41.650 { 00:12:41.650 "dma_device_id": "system", 00:12:41.650 "dma_device_type": 1 00:12:41.650 }, 00:12:41.650 { 00:12:41.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.650 "dma_device_type": 2 00:12:41.650 } 00:12:41.650 ], 00:12:41.650 "driver_specific": { 00:12:41.650 "raid": { 00:12:41.650 "uuid": "3f02a5be-3126-4b39-bfe2-5bde08137249", 00:12:41.650 "strip_size_kb": 64, 00:12:41.650 "state": "online", 00:12:41.650 "raid_level": "raid0", 00:12:41.650 "superblock": true, 00:12:41.650 "num_base_bdevs": 2, 00:12:41.650 "num_base_bdevs_discovered": 2, 00:12:41.650 "num_base_bdevs_operational": 2, 00:12:41.650 "base_bdevs_list": [ 00:12:41.650 { 00:12:41.650 "name": "BaseBdev1", 00:12:41.650 "uuid": "c8a2c754-4c6c-4e38-9496-a96ac00f4843", 00:12:41.650 "is_configured": true, 00:12:41.650 "data_offset": 2048, 00:12:41.650 "data_size": 63488 00:12:41.650 }, 00:12:41.650 { 00:12:41.650 "name": "BaseBdev2", 00:12:41.650 "uuid": "c3950c47-c078-4a23-8386-abcd0dbf5cc7", 00:12:41.650 "is_configured": true, 00:12:41.650 "data_offset": 2048, 00:12:41.650 "data_size": 63488 00:12:41.650 } 00:12:41.650 ] 00:12:41.650 } 00:12:41.650 } 00:12:41.650 }' 00:12:41.650 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:41.650 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:41.650 BaseBdev2' 00:12:41.650 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:41.650 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:41.650 07:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:41.650 07:18:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:41.650 "name": "BaseBdev1", 00:12:41.650 "aliases": [ 00:12:41.650 "c8a2c754-4c6c-4e38-9496-a96ac00f4843" 00:12:41.650 ], 00:12:41.650 "product_name": "Malloc disk", 00:12:41.650 "block_size": 512, 00:12:41.650 "num_blocks": 65536, 00:12:41.650 "uuid": "c8a2c754-4c6c-4e38-9496-a96ac00f4843", 00:12:41.650 "assigned_rate_limits": { 00:12:41.650 "rw_ios_per_sec": 0, 00:12:41.650 "rw_mbytes_per_sec": 0, 00:12:41.650 "r_mbytes_per_sec": 0, 00:12:41.650 "w_mbytes_per_sec": 0 00:12:41.650 }, 00:12:41.650 "claimed": true, 00:12:41.650 "claim_type": "exclusive_write", 00:12:41.650 "zoned": false, 00:12:41.650 "supported_io_types": { 00:12:41.650 "read": true, 00:12:41.650 "write": true, 00:12:41.650 "unmap": true, 00:12:41.650 "flush": true, 00:12:41.650 "reset": true, 00:12:41.650 "nvme_admin": false, 00:12:41.650 "nvme_io": false, 00:12:41.650 "nvme_io_md": false, 00:12:41.650 "write_zeroes": true, 00:12:41.650 "zcopy": true, 00:12:41.650 "get_zone_info": false, 00:12:41.650 "zone_management": false, 00:12:41.650 "zone_append": false, 00:12:41.650 "compare": false, 00:12:41.650 "compare_and_write": false, 00:12:41.650 "abort": true, 00:12:41.650 "seek_hole": false, 00:12:41.650 "seek_data": false, 00:12:41.650 "copy": true, 00:12:41.650 "nvme_iov_md": false 00:12:41.650 }, 00:12:41.650 "memory_domains": [ 00:12:41.650 { 00:12:41.650 "dma_device_id": "system", 00:12:41.650 "dma_device_type": 1 00:12:41.650 }, 00:12:41.650 { 00:12:41.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.650 "dma_device_type": 2 00:12:41.650 } 00:12:41.650 ], 00:12:41.650 "driver_specific": {} 00:12:41.650 }' 00:12:41.650 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.909 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.168 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.168 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.168 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:42.168 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:42.168 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:42.427 "name": "BaseBdev2", 00:12:42.427 
"aliases": [ 00:12:42.427 "c3950c47-c078-4a23-8386-abcd0dbf5cc7" 00:12:42.427 ], 00:12:42.427 "product_name": "Malloc disk", 00:12:42.427 "block_size": 512, 00:12:42.427 "num_blocks": 65536, 00:12:42.427 "uuid": "c3950c47-c078-4a23-8386-abcd0dbf5cc7", 00:12:42.427 "assigned_rate_limits": { 00:12:42.427 "rw_ios_per_sec": 0, 00:12:42.427 "rw_mbytes_per_sec": 0, 00:12:42.427 "r_mbytes_per_sec": 0, 00:12:42.427 "w_mbytes_per_sec": 0 00:12:42.427 }, 00:12:42.427 "claimed": true, 00:12:42.427 "claim_type": "exclusive_write", 00:12:42.427 "zoned": false, 00:12:42.427 "supported_io_types": { 00:12:42.427 "read": true, 00:12:42.427 "write": true, 00:12:42.427 "unmap": true, 00:12:42.427 "flush": true, 00:12:42.427 "reset": true, 00:12:42.427 "nvme_admin": false, 00:12:42.427 "nvme_io": false, 00:12:42.427 "nvme_io_md": false, 00:12:42.427 "write_zeroes": true, 00:12:42.427 "zcopy": true, 00:12:42.427 "get_zone_info": false, 00:12:42.427 "zone_management": false, 00:12:42.427 "zone_append": false, 00:12:42.427 "compare": false, 00:12:42.427 "compare_and_write": false, 00:12:42.427 "abort": true, 00:12:42.427 "seek_hole": false, 00:12:42.427 "seek_data": false, 00:12:42.427 "copy": true, 00:12:42.427 "nvme_iov_md": false 00:12:42.427 }, 00:12:42.427 "memory_domains": [ 00:12:42.427 { 00:12:42.427 "dma_device_id": "system", 00:12:42.427 "dma_device_type": 1 00:12:42.427 }, 00:12:42.427 { 00:12:42.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.427 "dma_device_type": 2 00:12:42.427 } 00:12:42.427 ], 00:12:42.427 "driver_specific": {} 00:12:42.427 }' 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.427 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.686 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.686 07:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.686 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.686 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.686 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:42.946 [2024-07-25 07:18:15.291520] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:42.946 [2024-07-25 07:18:15.291544] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:42.946 [2024-07-25 07:18:15.291581] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:42.946 07:18:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.946 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.205 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.205 "name": "Existed_Raid", 00:12:43.205 "uuid": "3f02a5be-3126-4b39-bfe2-5bde08137249", 00:12:43.205 "strip_size_kb": 64, 00:12:43.205 "state": "offline", 00:12:43.205 "raid_level": "raid0", 00:12:43.205 "superblock": true, 00:12:43.205 "num_base_bdevs": 2, 00:12:43.205 "num_base_bdevs_discovered": 1, 00:12:43.205 "num_base_bdevs_operational": 1, 00:12:43.205 "base_bdevs_list": [ 00:12:43.205 { 00:12:43.205 "name": null, 00:12:43.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.205 "is_configured": false, 00:12:43.205 "data_offset": 2048, 00:12:43.205 "data_size": 63488 00:12:43.205 }, 00:12:43.205 { 00:12:43.205 "name": "BaseBdev2", 00:12:43.205 "uuid": "c3950c47-c078-4a23-8386-abcd0dbf5cc7", 00:12:43.205 "is_configured": true, 00:12:43.205 "data_offset": 2048, 00:12:43.205 "data_size": 63488 00:12:43.205 } 00:12:43.205 ] 00:12:43.205 }' 00:12:43.205 07:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.205 07:18:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:43.773 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:43.773 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:43.773 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.773 07:18:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:44.032 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:44.032 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:44.032 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:44.032 [2024-07-25 07:18:16.543874] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:44.032 [2024-07-25 07:18:16.543916] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1523580 name Existed_Raid, state offline 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1588904 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1588904 ']' 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1588904 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:44.291 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1588904 00:12:44.551 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:44.551 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:44.551 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1588904' 00:12:44.551 killing process with pid 1588904 00:12:44.551 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1588904 00:12:44.551 [2024-07-25 07:18:16.847798] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:44.551 07:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1588904 00:12:44.551 [2024-07-25 07:18:16.848670] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:44.551 07:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:44.551 00:12:44.551 real 0m9.993s 00:12:44.551 user 0m17.782s 00:12:44.551 sys 0m1.859s 00:12:44.551 07:18:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:44.551 07:18:17 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:44.551 ************************************ 00:12:44.551 END TEST raid_state_function_test_sb 00:12:44.551 ************************************ 00:12:44.551 07:18:17 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:12:44.551 07:18:17 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:12:44.551 07:18:17 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:44.551 07:18:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:44.811 ************************************ 00:12:44.811 START TEST raid_superblock_test 00:12:44.811 ************************************ 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 2 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1590842 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1590842 /var/tmp/spdk-raid.sock 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1590842 ']' 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
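raid_superblock_test opens the same way every bdev_raid sub-test does: it launches the stand-alone bdev_svc application on a private RPC socket with bdev_raid debug logging, records the pid, and blocks in waitforlisten until the socket answers. A sketch of that prologue, assuming the helper functions from autotest_common.sh and the workspace paths shown in the trace (pid 1590842 is specific to this run):

  # host the malloc/passthru/raid bdevs in a dedicated bdev_svc process
  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$spdk/test/app/bdev_svc/bdev_svc" -r /var/tmp/spdk-raid.sock -L bdev_raid &
  raid_pid=$!
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock   # returns once the RPC server is listening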
00:12:44.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:44.811 07:18:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.811 [2024-07-25 07:18:17.171690] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:12:44.811 [2024-07-25 07:18:17.171744] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1590842 ] 00:12:44.811 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.811 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:44.811 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.811 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:44.811 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.811 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:44.811 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.811 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:44.811 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.811 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:44.811 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.811 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:44.811 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.3 cannot be used 
00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:44.812 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:44.812 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:44.812 [2024-07-25 07:18:17.301783] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:45.071 [2024-07-25 07:18:17.388485] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:45.072 [2024-07-25 07:18:17.448121] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:45.072 [2024-07-25 07:18:17.448166] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:45.639 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:45.898 malloc1 00:12:45.898 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:46.158 [2024-07-25 07:18:18.517716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:46.158 [2024-07-25 07:18:18.517761] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:46.158 [2024-07-25 07:18:18.517781] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b1280 00:12:46.158 [2024-07-25 07:18:18.517793] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:46.158 [2024-07-25 07:18:18.519416] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:46.158 [2024-07-25 07:18:18.519443] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:46.158 pt1 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:46.158 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:46.417 malloc2 00:12:46.417 07:18:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:46.676 [2024-07-25 07:18:18.979543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:46.676 [2024-07-25 07:18:18.979583] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:46.676 [2024-07-25 07:18:18.979599] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x275c8c0 00:12:46.676 [2024-07-25 07:18:18.979610] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:46.676 [2024-07-25 07:18:18.980904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:46.676 [2024-07-25 07:18:18.980929] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:46.676 pt2 00:12:46.676 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:46.676 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:46.676 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:46.934 [2024-07-25 07:18:19.220200] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:12:46.934 [2024-07-25 07:18:19.221347] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:46.934 [2024-07-25 07:18:19.221481] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x275a720 00:12:46.934 [2024-07-25 07:18:19.221493] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:46.934 [2024-07-25 07:18:19.221672] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b26e0 00:12:46.934 [2024-07-25 07:18:19.221799] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x275a720 00:12:46.934 [2024-07-25 07:18:19.221809] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x275a720 00:12:46.934 [2024-07-25 07:18:19.221897] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:46.934 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:46.934 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:46.934 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:46.934 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:46.934 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.934 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:46.935 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.935 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.935 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.935 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.935 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.935 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.935 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.935 "name": "raid_bdev1", 00:12:46.935 "uuid": "52c101c4-5f55-44e7-b19a-3ac0ea080e92", 00:12:46.935 "strip_size_kb": 64, 00:12:46.935 "state": "online", 00:12:46.935 "raid_level": "raid0", 00:12:46.935 "superblock": true, 00:12:46.935 "num_base_bdevs": 2, 00:12:46.935 "num_base_bdevs_discovered": 2, 00:12:46.935 "num_base_bdevs_operational": 2, 00:12:46.935 "base_bdevs_list": [ 00:12:46.935 { 00:12:46.935 "name": "pt1", 00:12:46.935 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:46.935 "is_configured": true, 00:12:46.935 "data_offset": 2048, 00:12:46.935 "data_size": 63488 00:12:46.935 }, 00:12:46.935 { 00:12:46.935 "name": "pt2", 00:12:46.935 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:46.935 "is_configured": true, 00:12:46.935 "data_offset": 2048, 00:12:46.935 "data_size": 63488 00:12:46.935 } 00:12:46.935 ] 00:12:46.935 }' 00:12:47.193 07:18:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.193 07:18:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- 
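By this point the array under test exists and reports "online" with both members configured: two 32 MiB, 512-byte-block malloc bdevs are wrapped in passthru bdevs with fixed UUIDs, and bdev_raid_create combines them with a 64 KiB strip size and -s so a superblock is written to each member (which is why only 63488 of each member's 65536 blocks show up as data, behind a 2048-block offset). A condensed sketch of the construction, restricted to RPCs that appear in the trace ($rpc abbreviates scripts/rpc.py -s /var/tmp/spdk-raid.sock):

  # two passthru-wrapped malloc bdevs become the raid0 members
  $rpc bdev_malloc_create 32 512 -b malloc1
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_malloc_create 32 512 -b malloc2
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  # -z 64: 64 KiB strip size, -s: persist a superblock on every base bdev
  $rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s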
bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:47.761 [2024-07-25 07:18:20.190964] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:47.761 "name": "raid_bdev1", 00:12:47.761 "aliases": [ 00:12:47.761 "52c101c4-5f55-44e7-b19a-3ac0ea080e92" 00:12:47.761 ], 00:12:47.761 "product_name": "Raid Volume", 00:12:47.761 "block_size": 512, 00:12:47.761 "num_blocks": 126976, 00:12:47.761 "uuid": "52c101c4-5f55-44e7-b19a-3ac0ea080e92", 00:12:47.761 "assigned_rate_limits": { 00:12:47.761 "rw_ios_per_sec": 0, 00:12:47.761 "rw_mbytes_per_sec": 0, 00:12:47.761 "r_mbytes_per_sec": 0, 00:12:47.761 "w_mbytes_per_sec": 0 00:12:47.761 }, 00:12:47.761 "claimed": false, 00:12:47.761 "zoned": false, 00:12:47.761 "supported_io_types": { 00:12:47.761 "read": true, 00:12:47.761 "write": true, 00:12:47.761 "unmap": true, 00:12:47.761 "flush": true, 00:12:47.761 "reset": true, 00:12:47.761 "nvme_admin": false, 00:12:47.761 "nvme_io": false, 00:12:47.761 "nvme_io_md": false, 00:12:47.761 "write_zeroes": true, 00:12:47.761 "zcopy": false, 00:12:47.761 "get_zone_info": false, 00:12:47.761 "zone_management": false, 00:12:47.761 "zone_append": false, 00:12:47.761 "compare": false, 00:12:47.761 "compare_and_write": false, 00:12:47.761 "abort": false, 00:12:47.761 "seek_hole": false, 00:12:47.761 "seek_data": false, 00:12:47.761 "copy": false, 00:12:47.761 "nvme_iov_md": false 00:12:47.761 }, 00:12:47.761 "memory_domains": [ 00:12:47.761 { 00:12:47.761 "dma_device_id": "system", 00:12:47.761 "dma_device_type": 1 00:12:47.761 }, 00:12:47.761 { 00:12:47.761 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.761 "dma_device_type": 2 00:12:47.761 }, 00:12:47.761 { 00:12:47.761 "dma_device_id": "system", 00:12:47.761 "dma_device_type": 1 00:12:47.761 }, 00:12:47.761 { 00:12:47.761 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.761 "dma_device_type": 2 00:12:47.761 } 00:12:47.761 ], 00:12:47.761 "driver_specific": { 00:12:47.761 "raid": { 00:12:47.761 "uuid": "52c101c4-5f55-44e7-b19a-3ac0ea080e92", 00:12:47.761 "strip_size_kb": 64, 00:12:47.761 "state": "online", 00:12:47.761 "raid_level": "raid0", 00:12:47.761 "superblock": true, 00:12:47.761 "num_base_bdevs": 2, 00:12:47.761 "num_base_bdevs_discovered": 2, 00:12:47.761 "num_base_bdevs_operational": 2, 00:12:47.761 "base_bdevs_list": [ 00:12:47.761 { 00:12:47.761 "name": "pt1", 00:12:47.761 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:47.761 "is_configured": true, 00:12:47.761 "data_offset": 2048, 00:12:47.761 "data_size": 63488 00:12:47.761 }, 00:12:47.761 { 00:12:47.761 "name": "pt2", 00:12:47.761 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:12:47.761 "is_configured": true, 00:12:47.761 "data_offset": 2048, 00:12:47.761 "data_size": 63488 00:12:47.761 } 00:12:47.761 ] 00:12:47.761 } 00:12:47.761 } 00:12:47.761 }' 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:47.761 pt2' 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:47.761 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:48.020 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:48.020 "name": "pt1", 00:12:48.020 "aliases": [ 00:12:48.020 "00000000-0000-0000-0000-000000000001" 00:12:48.020 ], 00:12:48.020 "product_name": "passthru", 00:12:48.020 "block_size": 512, 00:12:48.020 "num_blocks": 65536, 00:12:48.020 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:48.020 "assigned_rate_limits": { 00:12:48.020 "rw_ios_per_sec": 0, 00:12:48.020 "rw_mbytes_per_sec": 0, 00:12:48.020 "r_mbytes_per_sec": 0, 00:12:48.020 "w_mbytes_per_sec": 0 00:12:48.020 }, 00:12:48.020 "claimed": true, 00:12:48.020 "claim_type": "exclusive_write", 00:12:48.020 "zoned": false, 00:12:48.020 "supported_io_types": { 00:12:48.020 "read": true, 00:12:48.020 "write": true, 00:12:48.020 "unmap": true, 00:12:48.020 "flush": true, 00:12:48.020 "reset": true, 00:12:48.020 "nvme_admin": false, 00:12:48.020 "nvme_io": false, 00:12:48.020 "nvme_io_md": false, 00:12:48.020 "write_zeroes": true, 00:12:48.020 "zcopy": true, 00:12:48.020 "get_zone_info": false, 00:12:48.020 "zone_management": false, 00:12:48.020 "zone_append": false, 00:12:48.020 "compare": false, 00:12:48.020 "compare_and_write": false, 00:12:48.020 "abort": true, 00:12:48.020 "seek_hole": false, 00:12:48.020 "seek_data": false, 00:12:48.020 "copy": true, 00:12:48.020 "nvme_iov_md": false 00:12:48.020 }, 00:12:48.020 "memory_domains": [ 00:12:48.020 { 00:12:48.020 "dma_device_id": "system", 00:12:48.020 "dma_device_type": 1 00:12:48.020 }, 00:12:48.020 { 00:12:48.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.020 "dma_device_type": 2 00:12:48.020 } 00:12:48.020 ], 00:12:48.020 "driver_specific": { 00:12:48.020 "passthru": { 00:12:48.021 "name": "pt1", 00:12:48.021 "base_bdev_name": "malloc1" 00:12:48.021 } 00:12:48.021 } 00:12:48.021 }' 00:12:48.021 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.021 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.280 07:18:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.280 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.538 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.538 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:48.538 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:48.538 07:18:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:48.538 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:48.538 "name": "pt2", 00:12:48.538 "aliases": [ 00:12:48.538 "00000000-0000-0000-0000-000000000002" 00:12:48.538 ], 00:12:48.538 "product_name": "passthru", 00:12:48.538 "block_size": 512, 00:12:48.538 "num_blocks": 65536, 00:12:48.538 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:48.538 "assigned_rate_limits": { 00:12:48.538 "rw_ios_per_sec": 0, 00:12:48.538 "rw_mbytes_per_sec": 0, 00:12:48.538 "r_mbytes_per_sec": 0, 00:12:48.538 "w_mbytes_per_sec": 0 00:12:48.538 }, 00:12:48.538 "claimed": true, 00:12:48.538 "claim_type": "exclusive_write", 00:12:48.538 "zoned": false, 00:12:48.538 "supported_io_types": { 00:12:48.538 "read": true, 00:12:48.538 "write": true, 00:12:48.538 "unmap": true, 00:12:48.538 "flush": true, 00:12:48.538 "reset": true, 00:12:48.538 "nvme_admin": false, 00:12:48.538 "nvme_io": false, 00:12:48.538 "nvme_io_md": false, 00:12:48.538 "write_zeroes": true, 00:12:48.538 "zcopy": true, 00:12:48.538 "get_zone_info": false, 00:12:48.538 "zone_management": false, 00:12:48.538 "zone_append": false, 00:12:48.538 "compare": false, 00:12:48.538 "compare_and_write": false, 00:12:48.538 "abort": true, 00:12:48.539 "seek_hole": false, 00:12:48.539 "seek_data": false, 00:12:48.539 "copy": true, 00:12:48.539 "nvme_iov_md": false 00:12:48.539 }, 00:12:48.539 "memory_domains": [ 00:12:48.539 { 00:12:48.539 "dma_device_id": "system", 00:12:48.539 "dma_device_type": 1 00:12:48.539 }, 00:12:48.539 { 00:12:48.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.539 "dma_device_type": 2 00:12:48.539 } 00:12:48.539 ], 00:12:48.539 "driver_specific": { 00:12:48.539 "passthru": { 00:12:48.539 "name": "pt2", 00:12:48.539 "base_bdev_name": "malloc2" 00:12:48.539 } 00:12:48.539 } 00:12:48.539 }' 00:12:48.539 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:48.800 07:18:21 bdev_raid.raid_superblock_test -- 
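The block of jq checks running here is verify_raid_bdev_properties: it fetches the raid volume descriptor with bdev_get_bdevs -b raid_bdev1, then for each configured base bdev (pt1, pt2) pulls its descriptor the same way and asserts that block_size is 512 while md_size, md_interleave and dif_type are all null, i.e. the members expose plain 512-byte blocks with no metadata or DIF. A reduced sketch of one iteration under the same $rpc shorthand (the real script loops over the names reported in base_bdevs_list):

  # property check for one base bdev, mirroring bdev_raid.sh@204-208
  info=$($rpc bdev_get_bdevs -b pt1 | jq '.[]')
  [ "$(jq .block_size    <<< "$info")" = 512  ]
  [ "$(jq .md_size       <<< "$info")" = null ]
  [ "$(jq .md_interleave <<< "$info")" = null ]
  [ "$(jq .dif_type      <<< "$info")" = null ]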
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.087 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.087 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:49.087 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:49.087 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:12:49.345 [2024-07-25 07:18:21.626724] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:49.345 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=52c101c4-5f55-44e7-b19a-3ac0ea080e92 00:12:49.345 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 52c101c4-5f55-44e7-b19a-3ac0ea080e92 ']' 00:12:49.345 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:49.345 [2024-07-25 07:18:21.855094] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:49.345 [2024-07-25 07:18:21.855112] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:49.345 [2024-07-25 07:18:21.855164] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:49.345 [2024-07-25 07:18:21.855203] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:49.346 [2024-07-25 07:18:21.855213] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x275a720 name raid_bdev1, state offline 00:12:49.346 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.346 07:18:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:12:49.604 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:12:49.604 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:12:49.604 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:49.604 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:49.864 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:49.864 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:50.123 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:50.123 07:18:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 
'malloc1 malloc2' -n raid_bdev1 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.690 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:50.691 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.691 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:50.691 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.691 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:50.691 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:50.691 [2024-07-25 07:18:23.218631] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:50.691 [2024-07-25 07:18:23.219874] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:50.691 [2024-07-25 07:18:23.219925] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:50.691 [2024-07-25 07:18:23.219962] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:50.691 [2024-07-25 07:18:23.219979] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:50.691 [2024-07-25 07:18:23.219988] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x275b2b0 name raid_bdev1, state configuring 00:12:50.691 request: 00:12:50.691 { 00:12:50.691 "name": "raid_bdev1", 00:12:50.691 "raid_level": "raid0", 00:12:50.691 "base_bdevs": [ 00:12:50.691 "malloc1", 00:12:50.691 "malloc2" 00:12:50.691 ], 00:12:50.691 "strip_size_kb": 64, 00:12:50.691 "superblock": false, 00:12:50.691 "method": "bdev_raid_create", 00:12:50.691 "req_id": 1 00:12:50.691 } 00:12:50.691 Got JSON-RPC error response 00:12:50.691 response: 00:12:50.691 { 00:12:50.691 "code": -17, 00:12:50.691 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:50.691 } 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:50.954 07:18:23 
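This is the negative half of the superblock test: raid_bdev1 and both passthru bdevs have been torn down, but malloc1 and malloc2 still carry the on-disk superblock, so asking bdev_raid_create to build a brand-new array directly on them is refused with JSON-RPC error -17 ("File exists"), exactly as the response above shows. The NOT helper from autotest_common.sh inverts the exit status, so the expected failure counts as a pass. A hedged sketch of the check:

  # the malloc bdevs still hold raid_bdev1's superblock, so re-creation must fail
  # ($rpc as before; NOT is the autotest_common.sh helper that expects a non-zero exit)
  NOT $rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1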
bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:12:50.954 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:51.212 [2024-07-25 07:18:23.595569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:51.212 [2024-07-25 07:18:23.595606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.212 [2024-07-25 07:18:23.595622] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x275a490 00:12:51.212 [2024-07-25 07:18:23.595633] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.212 [2024-07-25 07:18:23.597083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.212 [2024-07-25 07:18:23.597109] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:51.212 [2024-07-25 07:18:23.597173] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:51.212 [2024-07-25 07:18:23.597196] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:51.212 pt1 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.212 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:51.471 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.471 "name": "raid_bdev1", 00:12:51.471 "uuid": "52c101c4-5f55-44e7-b19a-3ac0ea080e92", 00:12:51.471 "strip_size_kb": 64, 00:12:51.471 "state": "configuring", 00:12:51.471 "raid_level": "raid0", 00:12:51.471 
"superblock": true, 00:12:51.471 "num_base_bdevs": 2, 00:12:51.471 "num_base_bdevs_discovered": 1, 00:12:51.471 "num_base_bdevs_operational": 2, 00:12:51.471 "base_bdevs_list": [ 00:12:51.471 { 00:12:51.471 "name": "pt1", 00:12:51.471 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:51.471 "is_configured": true, 00:12:51.471 "data_offset": 2048, 00:12:51.471 "data_size": 63488 00:12:51.471 }, 00:12:51.471 { 00:12:51.471 "name": null, 00:12:51.471 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:51.471 "is_configured": false, 00:12:51.471 "data_offset": 2048, 00:12:51.471 "data_size": 63488 00:12:51.471 } 00:12:51.471 ] 00:12:51.471 }' 00:12:51.471 07:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.471 07:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.038 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:12:52.038 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:12:52.038 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:12:52.038 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:52.296 [2024-07-25 07:18:24.614261] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:52.296 [2024-07-25 07:18:24.614303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:52.296 [2024-07-25 07:18:24.614320] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x275bfa0 00:12:52.296 [2024-07-25 07:18:24.614331] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:52.296 [2024-07-25 07:18:24.614630] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:52.296 [2024-07-25 07:18:24.614646] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:52.296 [2024-07-25 07:18:24.614705] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:52.296 [2024-07-25 07:18:24.614721] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:52.296 [2024-07-25 07:18:24.614808] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b0220 00:12:52.296 [2024-07-25 07:18:24.614817] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:52.296 [2024-07-25 07:18:24.614971] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x275d7e0 00:12:52.296 [2024-07-25 07:18:24.615081] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b0220 00:12:52.296 [2024-07-25 07:18:24.615090] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25b0220 00:12:52.296 [2024-07-25 07:18:24.615188] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.296 pt2 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.296 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:52.555 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.555 "name": "raid_bdev1", 00:12:52.555 "uuid": "52c101c4-5f55-44e7-b19a-3ac0ea080e92", 00:12:52.555 "strip_size_kb": 64, 00:12:52.555 "state": "online", 00:12:52.555 "raid_level": "raid0", 00:12:52.555 "superblock": true, 00:12:52.555 "num_base_bdevs": 2, 00:12:52.555 "num_base_bdevs_discovered": 2, 00:12:52.555 "num_base_bdevs_operational": 2, 00:12:52.555 "base_bdevs_list": [ 00:12:52.555 { 00:12:52.555 "name": "pt1", 00:12:52.555 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:52.555 "is_configured": true, 00:12:52.555 "data_offset": 2048, 00:12:52.555 "data_size": 63488 00:12:52.555 }, 00:12:52.555 { 00:12:52.555 "name": "pt2", 00:12:52.555 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:52.555 "is_configured": true, 00:12:52.555 "data_offset": 2048, 00:12:52.555 "data_size": 63488 00:12:52.555 } 00:12:52.555 ] 00:12:52.555 }' 00:12:52.555 07:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.555 07:18:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:53.122 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:53.381 [2024-07-25 07:18:25.657236] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:53.381 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:53.381 
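The two steps above exercise superblock-driven re-assembly. Re-registering pt1 alone lets the examine path find the raid superblock and re-instantiate raid_bdev1, but with only one of two members discovered the array sits in "configuring"; once pt2 is re-created as well, its superblock is found too and the array returns to "online" with 2 of 2 base bdevs, which is what the state dump that follows verifies. A sketch of that second half, assuming the same $rpc shorthand:

  # adding the last missing member lets the raid assemble itself from its superblock
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  $rpc bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)"'
  # expected output: "online 2" (it read "configuring 1" while only pt1 was present)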
"name": "raid_bdev1", 00:12:53.381 "aliases": [ 00:12:53.381 "52c101c4-5f55-44e7-b19a-3ac0ea080e92" 00:12:53.381 ], 00:12:53.381 "product_name": "Raid Volume", 00:12:53.381 "block_size": 512, 00:12:53.381 "num_blocks": 126976, 00:12:53.381 "uuid": "52c101c4-5f55-44e7-b19a-3ac0ea080e92", 00:12:53.381 "assigned_rate_limits": { 00:12:53.381 "rw_ios_per_sec": 0, 00:12:53.381 "rw_mbytes_per_sec": 0, 00:12:53.381 "r_mbytes_per_sec": 0, 00:12:53.381 "w_mbytes_per_sec": 0 00:12:53.381 }, 00:12:53.381 "claimed": false, 00:12:53.381 "zoned": false, 00:12:53.381 "supported_io_types": { 00:12:53.381 "read": true, 00:12:53.381 "write": true, 00:12:53.381 "unmap": true, 00:12:53.381 "flush": true, 00:12:53.381 "reset": true, 00:12:53.381 "nvme_admin": false, 00:12:53.381 "nvme_io": false, 00:12:53.381 "nvme_io_md": false, 00:12:53.381 "write_zeroes": true, 00:12:53.381 "zcopy": false, 00:12:53.381 "get_zone_info": false, 00:12:53.381 "zone_management": false, 00:12:53.381 "zone_append": false, 00:12:53.381 "compare": false, 00:12:53.381 "compare_and_write": false, 00:12:53.381 "abort": false, 00:12:53.381 "seek_hole": false, 00:12:53.381 "seek_data": false, 00:12:53.381 "copy": false, 00:12:53.381 "nvme_iov_md": false 00:12:53.381 }, 00:12:53.381 "memory_domains": [ 00:12:53.381 { 00:12:53.381 "dma_device_id": "system", 00:12:53.381 "dma_device_type": 1 00:12:53.381 }, 00:12:53.381 { 00:12:53.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.381 "dma_device_type": 2 00:12:53.381 }, 00:12:53.381 { 00:12:53.381 "dma_device_id": "system", 00:12:53.381 "dma_device_type": 1 00:12:53.381 }, 00:12:53.381 { 00:12:53.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.381 "dma_device_type": 2 00:12:53.381 } 00:12:53.381 ], 00:12:53.381 "driver_specific": { 00:12:53.381 "raid": { 00:12:53.381 "uuid": "52c101c4-5f55-44e7-b19a-3ac0ea080e92", 00:12:53.381 "strip_size_kb": 64, 00:12:53.381 "state": "online", 00:12:53.381 "raid_level": "raid0", 00:12:53.381 "superblock": true, 00:12:53.381 "num_base_bdevs": 2, 00:12:53.381 "num_base_bdevs_discovered": 2, 00:12:53.381 "num_base_bdevs_operational": 2, 00:12:53.381 "base_bdevs_list": [ 00:12:53.381 { 00:12:53.381 "name": "pt1", 00:12:53.381 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:53.381 "is_configured": true, 00:12:53.381 "data_offset": 2048, 00:12:53.381 "data_size": 63488 00:12:53.381 }, 00:12:53.381 { 00:12:53.381 "name": "pt2", 00:12:53.381 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:53.381 "is_configured": true, 00:12:53.381 "data_offset": 2048, 00:12:53.381 "data_size": 63488 00:12:53.381 } 00:12:53.381 ] 00:12:53.381 } 00:12:53.381 } 00:12:53.381 }' 00:12:53.381 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:53.381 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:53.381 pt2' 00:12:53.381 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:53.381 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:53.381 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:53.640 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:53.640 "name": "pt1", 00:12:53.640 "aliases": [ 00:12:53.640 "00000000-0000-0000-0000-000000000001" 
00:12:53.640 ], 00:12:53.640 "product_name": "passthru", 00:12:53.640 "block_size": 512, 00:12:53.640 "num_blocks": 65536, 00:12:53.640 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:53.640 "assigned_rate_limits": { 00:12:53.640 "rw_ios_per_sec": 0, 00:12:53.640 "rw_mbytes_per_sec": 0, 00:12:53.640 "r_mbytes_per_sec": 0, 00:12:53.640 "w_mbytes_per_sec": 0 00:12:53.640 }, 00:12:53.640 "claimed": true, 00:12:53.640 "claim_type": "exclusive_write", 00:12:53.640 "zoned": false, 00:12:53.640 "supported_io_types": { 00:12:53.640 "read": true, 00:12:53.640 "write": true, 00:12:53.640 "unmap": true, 00:12:53.640 "flush": true, 00:12:53.640 "reset": true, 00:12:53.640 "nvme_admin": false, 00:12:53.640 "nvme_io": false, 00:12:53.640 "nvme_io_md": false, 00:12:53.640 "write_zeroes": true, 00:12:53.640 "zcopy": true, 00:12:53.640 "get_zone_info": false, 00:12:53.640 "zone_management": false, 00:12:53.640 "zone_append": false, 00:12:53.640 "compare": false, 00:12:53.640 "compare_and_write": false, 00:12:53.640 "abort": true, 00:12:53.640 "seek_hole": false, 00:12:53.640 "seek_data": false, 00:12:53.640 "copy": true, 00:12:53.640 "nvme_iov_md": false 00:12:53.640 }, 00:12:53.640 "memory_domains": [ 00:12:53.640 { 00:12:53.640 "dma_device_id": "system", 00:12:53.640 "dma_device_type": 1 00:12:53.640 }, 00:12:53.640 { 00:12:53.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.640 "dma_device_type": 2 00:12:53.640 } 00:12:53.640 ], 00:12:53.640 "driver_specific": { 00:12:53.640 "passthru": { 00:12:53.640 "name": "pt1", 00:12:53.640 "base_bdev_name": "malloc1" 00:12:53.640 } 00:12:53.640 } 00:12:53.640 }' 00:12:53.640 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.640 07:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.640 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:53.640 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.640 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.640 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:53.640 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.640 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.899 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.899 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.899 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.899 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.899 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:53.899 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:53.899 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:54.157 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:54.157 "name": "pt2", 00:12:54.157 "aliases": [ 00:12:54.157 "00000000-0000-0000-0000-000000000002" 00:12:54.157 ], 00:12:54.157 "product_name": "passthru", 00:12:54.157 "block_size": 512, 00:12:54.157 "num_blocks": 65536, 00:12:54.157 
"uuid": "00000000-0000-0000-0000-000000000002", 00:12:54.157 "assigned_rate_limits": { 00:12:54.157 "rw_ios_per_sec": 0, 00:12:54.157 "rw_mbytes_per_sec": 0, 00:12:54.157 "r_mbytes_per_sec": 0, 00:12:54.157 "w_mbytes_per_sec": 0 00:12:54.157 }, 00:12:54.157 "claimed": true, 00:12:54.157 "claim_type": "exclusive_write", 00:12:54.157 "zoned": false, 00:12:54.157 "supported_io_types": { 00:12:54.157 "read": true, 00:12:54.157 "write": true, 00:12:54.157 "unmap": true, 00:12:54.157 "flush": true, 00:12:54.157 "reset": true, 00:12:54.157 "nvme_admin": false, 00:12:54.157 "nvme_io": false, 00:12:54.157 "nvme_io_md": false, 00:12:54.157 "write_zeroes": true, 00:12:54.157 "zcopy": true, 00:12:54.157 "get_zone_info": false, 00:12:54.157 "zone_management": false, 00:12:54.157 "zone_append": false, 00:12:54.157 "compare": false, 00:12:54.157 "compare_and_write": false, 00:12:54.157 "abort": true, 00:12:54.157 "seek_hole": false, 00:12:54.157 "seek_data": false, 00:12:54.157 "copy": true, 00:12:54.157 "nvme_iov_md": false 00:12:54.157 }, 00:12:54.157 "memory_domains": [ 00:12:54.157 { 00:12:54.157 "dma_device_id": "system", 00:12:54.157 "dma_device_type": 1 00:12:54.157 }, 00:12:54.157 { 00:12:54.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.157 "dma_device_type": 2 00:12:54.157 } 00:12:54.157 ], 00:12:54.157 "driver_specific": { 00:12:54.157 "passthru": { 00:12:54.157 "name": "pt2", 00:12:54.157 "base_bdev_name": "malloc2" 00:12:54.157 } 00:12:54.157 } 00:12:54.157 }' 00:12:54.157 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.157 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.157 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:54.157 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:54.157 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:54.416 07:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:12:54.676 [2024-07-25 07:18:27.072966] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 52c101c4-5f55-44e7-b19a-3ac0ea080e92 '!=' 52c101c4-5f55-44e7-b19a-3ac0ea080e92 ']' 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # 
return 1 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1590842 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1590842 ']' 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1590842 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1590842 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1590842' 00:12:54.676 killing process with pid 1590842 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1590842 00:12:54.676 [2024-07-25 07:18:27.152067] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:54.676 [2024-07-25 07:18:27.152122] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.676 [2024-07-25 07:18:27.152169] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:54.676 [2024-07-25 07:18:27.152181] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b0220 name raid_bdev1, state offline 00:12:54.676 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1590842 00:12:54.676 [2024-07-25 07:18:27.168064] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:54.935 07:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:12:54.935 00:12:54.935 real 0m10.246s 00:12:54.935 user 0m18.296s 00:12:54.935 sys 0m1.917s 00:12:54.935 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:54.935 07:18:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.935 ************************************ 00:12:54.935 END TEST raid_superblock_test 00:12:54.935 ************************************ 00:12:54.935 07:18:27 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:12:54.935 07:18:27 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:54.935 07:18:27 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:54.935 07:18:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:54.935 ************************************ 00:12:54.935 START TEST raid_read_error_test 00:12:54.935 ************************************ 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 read 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.Lfwp4MhwDl 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1592907 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1592907 /var/tmp/spdk-raid.sock 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:54.935 07:18:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1592907 ']' 00:12:54.936 07:18:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:54.936 07:18:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:54.936 07:18:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:54.936 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:54.936 07:18:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:54.936 07:18:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.195 [2024-07-25 07:18:27.515260] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
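The read-error variant shown here builds its array out of stacked bdevs: each malloc bdev is wrapped in a bdev_error bdev (which takes the EE_ prefix) and then in a passthru bdev, and the two passthru bdevs become the legs of a raid0 volume that bdevperf drives over /var/tmp/spdk-raid.sock. A minimal sketch of that stack, assuming rpc.py is invoked from an SPDK checkout and using $rpc as shorthand for the socket-qualified call (both the relative path and the shorthand are assumptions, not part of this log):

  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"          # assumed shorthand for the calls below
  for i in 1 2; do
    $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc    # 32 MB backing store, 512-byte blocks
    $rpc bdev_error_create BaseBdev${i}_malloc               # registers EE_BaseBdev${i}_malloc
    $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
  done
  $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s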
00:12:55.195 [2024-07-25 07:18:27.515315] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1592907 ] 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:55.195 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:55.195 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:55.195 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:55.195 [2024-07-25 07:18:27.647062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:55.455 [2024-07-25 07:18:27.733414] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:55.455 [2024-07-25 07:18:27.793484] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.455 [2024-07-25 07:18:27.793524] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:56.023 07:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:56.023 07:18:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:56.023 07:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:56.023 07:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:56.282 BaseBdev1_malloc 00:12:56.282 07:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:56.540 true 00:12:56.541 07:18:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:56.799 [2024-07-25 07:18:29.079339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:56.800 [2024-07-25 07:18:29.079378] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:56.800 [2024-07-25 07:18:29.079397] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a3a50 00:12:56.800 [2024-07-25 07:18:29.079409] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:56.800 [2024-07-25 07:18:29.080892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:56.800 [2024-07-25 07:18:29.080919] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:56.800 BaseBdev1 00:12:56.800 07:18:29 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:56.800 07:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:56.800 BaseBdev2_malloc 00:12:56.800 07:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:57.058 true 00:12:57.058 07:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:57.318 [2024-07-25 07:18:29.761498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:57.318 [2024-07-25 07:18:29.761536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:57.318 [2024-07-25 07:18:29.761554] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x284cf40 00:12:57.318 [2024-07-25 07:18:29.761566] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:57.318 [2024-07-25 07:18:29.762931] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:57.318 [2024-07-25 07:18:29.762958] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:57.318 BaseBdev2 00:12:57.318 07:18:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:57.577 [2024-07-25 07:18:29.986112] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:57.577 [2024-07-25 07:18:29.987302] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:57.577 [2024-07-25 07:18:29.987481] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x284f860 00:12:57.577 [2024-07-25 07:18:29.987494] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:57.577 [2024-07-25 07:18:29.987666] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x284f280 00:12:57.577 [2024-07-25 07:18:29.987798] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x284f860 00:12:57.577 [2024-07-25 07:18:29.987807] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x284f860 00:12:57.577 [2024-07-25 07:18:29.987900] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.577 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.578 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.578 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:57.837 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.837 "name": "raid_bdev1", 00:12:57.837 "uuid": "334ac68c-ce63-4d1c-bab7-bc8df51b5de8", 00:12:57.837 "strip_size_kb": 64, 00:12:57.837 "state": "online", 00:12:57.837 "raid_level": "raid0", 00:12:57.837 "superblock": true, 00:12:57.837 "num_base_bdevs": 2, 00:12:57.837 "num_base_bdevs_discovered": 2, 00:12:57.837 "num_base_bdevs_operational": 2, 00:12:57.837 "base_bdevs_list": [ 00:12:57.837 { 00:12:57.837 "name": "BaseBdev1", 00:12:57.837 "uuid": "8f4cef53-80bd-5696-8a98-01836079a805", 00:12:57.837 "is_configured": true, 00:12:57.837 "data_offset": 2048, 00:12:57.837 "data_size": 63488 00:12:57.837 }, 00:12:57.837 { 00:12:57.837 "name": "BaseBdev2", 00:12:57.837 "uuid": "365b1e2e-2009-5da5-80b1-7435b976cc58", 00:12:57.837 "is_configured": true, 00:12:57.837 "data_offset": 2048, 00:12:57.837 "data_size": 63488 00:12:57.837 } 00:12:57.837 ] 00:12:57.837 }' 00:12:57.837 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.837 07:18:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.405 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:12:58.405 07:18:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:58.405 [2024-07-25 07:18:30.929078] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x284f280 00:12:59.343 07:18:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:59.602 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:12:59.602 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:59.602 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:12:59.602 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:59.602 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:59.602 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.603 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:59.862 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.862 "name": "raid_bdev1", 00:12:59.862 "uuid": "334ac68c-ce63-4d1c-bab7-bc8df51b5de8", 00:12:59.862 "strip_size_kb": 64, 00:12:59.862 "state": "online", 00:12:59.862 "raid_level": "raid0", 00:12:59.862 "superblock": true, 00:12:59.862 "num_base_bdevs": 2, 00:12:59.862 "num_base_bdevs_discovered": 2, 00:12:59.862 "num_base_bdevs_operational": 2, 00:12:59.862 "base_bdevs_list": [ 00:12:59.862 { 00:12:59.862 "name": "BaseBdev1", 00:12:59.862 "uuid": "8f4cef53-80bd-5696-8a98-01836079a805", 00:12:59.862 "is_configured": true, 00:12:59.862 "data_offset": 2048, 00:12:59.862 "data_size": 63488 00:12:59.862 }, 00:12:59.862 { 00:12:59.862 "name": "BaseBdev2", 00:12:59.862 "uuid": "365b1e2e-2009-5da5-80b1-7435b976cc58", 00:12:59.862 "is_configured": true, 00:12:59.862 "data_offset": 2048, 00:12:59.862 "data_size": 63488 00:12:59.862 } 00:12:59.862 ] 00:12:59.862 }' 00:12:59.862 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.862 07:18:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.430 07:18:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:00.689 [2024-07-25 07:18:33.001947] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:00.689 [2024-07-25 07:18:33.001985] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:00.689 [2024-07-25 07:18:33.004888] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:00.689 [2024-07-25 07:18:33.004915] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:00.689 [2024-07-25 07:18:33.004940] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:00.689 [2024-07-25 07:18:33.004951] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x284f860 name raid_bdev1, state offline 00:13:00.689 0 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1592907 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1592907 ']' 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1592907 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1592907 00:13:00.689 07:18:33 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1592907' 00:13:00.689 killing process with pid 1592907 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1592907 00:13:00.689 [2024-07-25 07:18:33.075983] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:00.689 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1592907 00:13:00.689 [2024-07-25 07:18:33.085901] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:00.948 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.Lfwp4MhwDl 00:13:00.948 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:13:00.948 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:13:00.948 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.48 00:13:00.949 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:13:00.949 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:00.949 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:00.949 07:18:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.48 != \0\.\0\0 ]] 00:13:00.949 00:13:00.949 real 0m5.850s 00:13:00.949 user 0m9.050s 00:13:00.949 sys 0m1.042s 00:13:00.949 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:00.949 07:18:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.949 ************************************ 00:13:00.949 END TEST raid_read_error_test 00:13:00.949 ************************************ 00:13:00.949 07:18:33 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:13:00.949 07:18:33 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:00.949 07:18:33 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:00.949 07:18:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:00.949 ************************************ 00:13:00.949 START TEST raid_write_error_test 00:13:00.949 ************************************ 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 write 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:00.949 
07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.ZmwHamOikj 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1593937 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1593937 /var/tmp/spdk-raid.sock 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1593937 ']' 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:00.949 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:00.949 07:18:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.949 [2024-07-25 07:18:33.442785] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
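The write-error variant builds the same stack, then arms one leg for failure before driving traffic; the pass/fail decision comes from the fail-per-second column of the bdevperf log. A sketch of that step, reusing the assumed $rpc shorthand from the earlier sketch, with $bdevperf_log standing in for whatever mktemp -p /raidtest returned:

  $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure    # the read variant injects 'read failure'
  ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
  fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
  [[ $fail_per_s != "0.00" ]]        # the injected errors must show up as failed I/O on raid_bdev1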
00:13:00.949 [2024-07-25 07:18:33.442843] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593937 ] 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:01.208 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:01.208 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:01.208 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:01.208 [2024-07-25 07:18:33.574135] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:01.208 [2024-07-25 07:18:33.664278] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:01.208 [2024-07-25 07:18:33.717392] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:01.208 [2024-07-25 07:18:33.717419] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:02.144 07:18:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:02.144 07:18:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:13:02.144 07:18:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:02.144 07:18:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:02.144 BaseBdev1_malloc 00:13:02.144 07:18:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:02.403 true 00:13:02.403 07:18:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:02.662 [2024-07-25 07:18:34.997841] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:02.662 [2024-07-25 07:18:34.997880] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:02.662 [2024-07-25 07:18:34.997898] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20daa50 00:13:02.662 [2024-07-25 07:18:34.997910] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:02.662 [2024-07-25 07:18:34.999496] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:02.662 [2024-07-25 07:18:34.999523] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:02.662 BaseBdev1 00:13:02.662 07:18:35 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:02.662 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:02.920 BaseBdev2_malloc 00:13:02.920 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:02.920 true 00:13:02.920 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:03.180 [2024-07-25 07:18:35.655840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:03.180 [2024-07-25 07:18:35.655880] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.180 [2024-07-25 07:18:35.655897] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2283f40 00:13:03.180 [2024-07-25 07:18:35.655909] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.180 [2024-07-25 07:18:35.657296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.180 [2024-07-25 07:18:35.657323] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:03.180 BaseBdev2 00:13:03.180 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:03.473 [2024-07-25 07:18:35.880457] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:03.473 [2024-07-25 07:18:35.881644] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:03.473 [2024-07-25 07:18:35.881822] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2286860 00:13:03.473 [2024-07-25 07:18:35.881834] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:03.473 [2024-07-25 07:18:35.882009] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2286280 00:13:03.473 [2024-07-25 07:18:35.882149] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2286860 00:13:03.473 [2024-07-25 07:18:35.882160] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2286860 00:13:03.473 [2024-07-25 07:18:35.882254] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.473 07:18:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:03.731 07:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.731 "name": "raid_bdev1", 00:13:03.731 "uuid": "cea7883d-9c07-4c2e-83b6-db68c3119195", 00:13:03.731 "strip_size_kb": 64, 00:13:03.731 "state": "online", 00:13:03.731 "raid_level": "raid0", 00:13:03.731 "superblock": true, 00:13:03.731 "num_base_bdevs": 2, 00:13:03.731 "num_base_bdevs_discovered": 2, 00:13:03.731 "num_base_bdevs_operational": 2, 00:13:03.731 "base_bdevs_list": [ 00:13:03.731 { 00:13:03.731 "name": "BaseBdev1", 00:13:03.731 "uuid": "e157c033-74df-534d-ae06-6547d69fa641", 00:13:03.731 "is_configured": true, 00:13:03.731 "data_offset": 2048, 00:13:03.731 "data_size": 63488 00:13:03.731 }, 00:13:03.731 { 00:13:03.731 "name": "BaseBdev2", 00:13:03.731 "uuid": "4616737f-00bf-5ff0-af78-569682e0dd76", 00:13:03.731 "is_configured": true, 00:13:03.731 "data_offset": 2048, 00:13:03.732 "data_size": 63488 00:13:03.732 } 00:13:03.732 ] 00:13:03.732 }' 00:13:03.732 07:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.732 07:18:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.298 07:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:04.298 07:18:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:13:04.298 [2024-07-25 07:18:36.767019] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2286280 00:13:05.234 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:05.492 07:18:37 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.492 07:18:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:05.751 07:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.751 "name": "raid_bdev1", 00:13:05.751 "uuid": "cea7883d-9c07-4c2e-83b6-db68c3119195", 00:13:05.751 "strip_size_kb": 64, 00:13:05.751 "state": "online", 00:13:05.751 "raid_level": "raid0", 00:13:05.751 "superblock": true, 00:13:05.751 "num_base_bdevs": 2, 00:13:05.751 "num_base_bdevs_discovered": 2, 00:13:05.751 "num_base_bdevs_operational": 2, 00:13:05.751 "base_bdevs_list": [ 00:13:05.751 { 00:13:05.751 "name": "BaseBdev1", 00:13:05.751 "uuid": "e157c033-74df-534d-ae06-6547d69fa641", 00:13:05.751 "is_configured": true, 00:13:05.751 "data_offset": 2048, 00:13:05.751 "data_size": 63488 00:13:05.751 }, 00:13:05.751 { 00:13:05.751 "name": "BaseBdev2", 00:13:05.751 "uuid": "4616737f-00bf-5ff0-af78-569682e0dd76", 00:13:05.751 "is_configured": true, 00:13:05.751 "data_offset": 2048, 00:13:05.751 "data_size": 63488 00:13:05.751 } 00:13:05.751 ] 00:13:05.751 }' 00:13:05.751 07:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.751 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.318 07:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:06.577 [2024-07-25 07:18:38.888939] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:06.577 [2024-07-25 07:18:38.888974] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:06.577 [2024-07-25 07:18:38.891887] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:06.577 [2024-07-25 07:18:38.891915] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:06.577 [2024-07-25 07:18:38.891940] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:06.577 [2024-07-25 07:18:38.891950] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2286860 name raid_bdev1, state offline 00:13:06.577 0 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1593937 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1593937 ']' 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1593937 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 
1593937 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1593937' 00:13:06.577 killing process with pid 1593937 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1593937 00:13:06.577 [2024-07-25 07:18:38.962109] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:06.577 07:18:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1593937 00:13:06.577 [2024-07-25 07:18:38.971586] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.ZmwHamOikj 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:13:06.836 00:13:06.836 real 0m5.802s 00:13:06.836 user 0m9.060s 00:13:06.836 sys 0m0.969s 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:06.836 07:18:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.836 ************************************ 00:13:06.836 END TEST raid_write_error_test 00:13:06.837 ************************************ 00:13:06.837 07:18:39 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:13:06.837 07:18:39 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:13:06.837 07:18:39 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:06.837 07:18:39 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:06.837 07:18:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:06.837 ************************************ 00:13:06.837 START TEST raid_state_function_test 00:13:06.837 ************************************ 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 false 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.837 07:18:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1594963 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1594963' 00:13:06.837 Process raid pid: 1594963 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1594963 /var/tmp/spdk-raid.sock 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1594963 ']' 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:06.837 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:06.837 07:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.837 [2024-07-25 07:18:39.330386] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
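raid_state_function_test runs against bdev_svc rather than bdevperf and begins by creating Existed_Raid from base bdevs that do not exist yet, so the raid bdev has to sit in the configuring state until they are registered. A sketch of that first check, again with the assumed $rpc shorthand on the same socket:

  $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
  # prints "configuring" while neither BaseBdev1 nor BaseBdev2 has been registered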
00:13:06.837 [2024-07-25 07:18:39.330467] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:07.097 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:07.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:07.097 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:07.097 [2024-07-25 07:18:39.466384] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:07.097 [2024-07-25 07:18:39.548719] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:07.097 [2024-07-25 07:18:39.604364] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.097 [2024-07-25 07:18:39.604396] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.665 07:18:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:07.665 07:18:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:13:07.665 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:07.924 [2024-07-25 07:18:40.377940] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:07.924 [2024-07-25 07:18:40.377982] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:07.924 [2024-07-25 07:18:40.377992] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:07.924 [2024-07-25 07:18:40.378003] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.924 07:18:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.924 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.183 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.183 "name": "Existed_Raid", 00:13:08.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.183 "strip_size_kb": 64, 00:13:08.183 "state": "configuring", 00:13:08.183 "raid_level": "concat", 00:13:08.183 "superblock": false, 00:13:08.183 "num_base_bdevs": 2, 00:13:08.183 "num_base_bdevs_discovered": 0, 00:13:08.183 "num_base_bdevs_operational": 2, 00:13:08.183 "base_bdevs_list": [ 00:13:08.183 { 00:13:08.183 "name": "BaseBdev1", 00:13:08.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.183 "is_configured": false, 00:13:08.183 "data_offset": 0, 00:13:08.183 "data_size": 0 00:13:08.183 }, 00:13:08.183 { 00:13:08.183 "name": "BaseBdev2", 00:13:08.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.183 "is_configured": false, 00:13:08.183 "data_offset": 0, 00:13:08.183 "data_size": 0 00:13:08.183 } 00:13:08.183 ] 00:13:08.183 }' 00:13:08.183 07:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.183 07:18:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.750 07:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:09.009 [2024-07-25 07:18:41.344360] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:09.009 [2024-07-25 07:18:41.344394] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b1ea0 name Existed_Raid, state configuring 00:13:09.009 07:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:09.269 [2024-07-25 07:18:41.572965] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:09.269 [2024-07-25 07:18:41.572989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:09.269 [2024-07-25 07:18:41.573003] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:09.269 [2024-07-25 07:18:41.573014] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:09.269 07:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:09.269 [2024-07-25 07:18:41.746832] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:09.269 BaseBdev1 00:13:09.269 07:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:09.269 
07:18:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:09.269 07:18:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:09.269 07:18:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:09.269 07:18:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:09.269 07:18:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:09.269 07:18:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:09.528 07:18:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:09.787 [ 00:13:09.787 { 00:13:09.787 "name": "BaseBdev1", 00:13:09.787 "aliases": [ 00:13:09.787 "aa242db6-ab80-4efd-ab24-65beda732b3e" 00:13:09.787 ], 00:13:09.787 "product_name": "Malloc disk", 00:13:09.787 "block_size": 512, 00:13:09.787 "num_blocks": 65536, 00:13:09.787 "uuid": "aa242db6-ab80-4efd-ab24-65beda732b3e", 00:13:09.787 "assigned_rate_limits": { 00:13:09.787 "rw_ios_per_sec": 0, 00:13:09.787 "rw_mbytes_per_sec": 0, 00:13:09.787 "r_mbytes_per_sec": 0, 00:13:09.787 "w_mbytes_per_sec": 0 00:13:09.787 }, 00:13:09.787 "claimed": true, 00:13:09.787 "claim_type": "exclusive_write", 00:13:09.787 "zoned": false, 00:13:09.787 "supported_io_types": { 00:13:09.787 "read": true, 00:13:09.787 "write": true, 00:13:09.787 "unmap": true, 00:13:09.787 "flush": true, 00:13:09.787 "reset": true, 00:13:09.787 "nvme_admin": false, 00:13:09.787 "nvme_io": false, 00:13:09.787 "nvme_io_md": false, 00:13:09.787 "write_zeroes": true, 00:13:09.787 "zcopy": true, 00:13:09.787 "get_zone_info": false, 00:13:09.787 "zone_management": false, 00:13:09.787 "zone_append": false, 00:13:09.787 "compare": false, 00:13:09.787 "compare_and_write": false, 00:13:09.787 "abort": true, 00:13:09.787 "seek_hole": false, 00:13:09.787 "seek_data": false, 00:13:09.787 "copy": true, 00:13:09.787 "nvme_iov_md": false 00:13:09.787 }, 00:13:09.787 "memory_domains": [ 00:13:09.787 { 00:13:09.787 "dma_device_id": "system", 00:13:09.787 "dma_device_type": 1 00:13:09.787 }, 00:13:09.787 { 00:13:09.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.787 "dma_device_type": 2 00:13:09.787 } 00:13:09.787 ], 00:13:09.787 "driver_specific": {} 00:13:09.787 } 00:13:09.787 ] 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.787 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.047 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.047 "name": "Existed_Raid", 00:13:10.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.047 "strip_size_kb": 64, 00:13:10.047 "state": "configuring", 00:13:10.047 "raid_level": "concat", 00:13:10.047 "superblock": false, 00:13:10.047 "num_base_bdevs": 2, 00:13:10.047 "num_base_bdevs_discovered": 1, 00:13:10.047 "num_base_bdevs_operational": 2, 00:13:10.047 "base_bdevs_list": [ 00:13:10.047 { 00:13:10.047 "name": "BaseBdev1", 00:13:10.047 "uuid": "aa242db6-ab80-4efd-ab24-65beda732b3e", 00:13:10.047 "is_configured": true, 00:13:10.047 "data_offset": 0, 00:13:10.047 "data_size": 65536 00:13:10.047 }, 00:13:10.047 { 00:13:10.047 "name": "BaseBdev2", 00:13:10.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.047 "is_configured": false, 00:13:10.047 "data_offset": 0, 00:13:10.047 "data_size": 0 00:13:10.047 } 00:13:10.047 ] 00:13:10.047 }' 00:13:10.047 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.047 07:18:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.613 07:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:10.873 [2024-07-25 07:18:43.206675] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:10.873 [2024-07-25 07:18:43.206715] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b1790 name Existed_Raid, state configuring 00:13:10.873 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:11.441 [2024-07-25 07:18:43.703999] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.441 [2024-07-25 07:18:43.705495] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.441 [2024-07-25 07:18:43.705527] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.441 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.442 "name": "Existed_Raid", 00:13:11.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.442 "strip_size_kb": 64, 00:13:11.442 "state": "configuring", 00:13:11.442 "raid_level": "concat", 00:13:11.442 "superblock": false, 00:13:11.442 "num_base_bdevs": 2, 00:13:11.442 "num_base_bdevs_discovered": 1, 00:13:11.442 "num_base_bdevs_operational": 2, 00:13:11.442 "base_bdevs_list": [ 00:13:11.442 { 00:13:11.442 "name": "BaseBdev1", 00:13:11.442 "uuid": "aa242db6-ab80-4efd-ab24-65beda732b3e", 00:13:11.442 "is_configured": true, 00:13:11.442 "data_offset": 0, 00:13:11.442 "data_size": 65536 00:13:11.442 }, 00:13:11.442 { 00:13:11.442 "name": "BaseBdev2", 00:13:11.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.442 "is_configured": false, 00:13:11.442 "data_offset": 0, 00:13:11.442 "data_size": 0 00:13:11.442 } 00:13:11.442 ] 00:13:11.442 }' 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.442 07:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:12.010 07:18:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:12.579 [2024-07-25 07:18:45.002637] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:12.579 [2024-07-25 07:18:45.002672] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x9b2580 00:13:12.579 [2024-07-25 07:18:45.002680] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:12.579 [2024-07-25 07:18:45.002860] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a8d00 00:13:12.579 [2024-07-25 07:18:45.002978] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9b2580 00:13:12.579 [2024-07-25 07:18:45.002987] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9b2580 00:13:12.579 [2024-07-25 07:18:45.003137] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:12.579 BaseBdev2 00:13:12.579 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 
00:13:12.579 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:12.579 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:12.579 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:12.579 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:12.579 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:12.579 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:12.838 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:13.407 [ 00:13:13.407 { 00:13:13.407 "name": "BaseBdev2", 00:13:13.407 "aliases": [ 00:13:13.407 "b3186b7d-2c75-4000-b415-8f699f6d93bf" 00:13:13.407 ], 00:13:13.407 "product_name": "Malloc disk", 00:13:13.407 "block_size": 512, 00:13:13.407 "num_blocks": 65536, 00:13:13.407 "uuid": "b3186b7d-2c75-4000-b415-8f699f6d93bf", 00:13:13.407 "assigned_rate_limits": { 00:13:13.407 "rw_ios_per_sec": 0, 00:13:13.407 "rw_mbytes_per_sec": 0, 00:13:13.407 "r_mbytes_per_sec": 0, 00:13:13.407 "w_mbytes_per_sec": 0 00:13:13.407 }, 00:13:13.407 "claimed": true, 00:13:13.407 "claim_type": "exclusive_write", 00:13:13.407 "zoned": false, 00:13:13.407 "supported_io_types": { 00:13:13.407 "read": true, 00:13:13.407 "write": true, 00:13:13.407 "unmap": true, 00:13:13.407 "flush": true, 00:13:13.407 "reset": true, 00:13:13.407 "nvme_admin": false, 00:13:13.407 "nvme_io": false, 00:13:13.407 "nvme_io_md": false, 00:13:13.407 "write_zeroes": true, 00:13:13.407 "zcopy": true, 00:13:13.407 "get_zone_info": false, 00:13:13.407 "zone_management": false, 00:13:13.407 "zone_append": false, 00:13:13.407 "compare": false, 00:13:13.407 "compare_and_write": false, 00:13:13.407 "abort": true, 00:13:13.407 "seek_hole": false, 00:13:13.407 "seek_data": false, 00:13:13.407 "copy": true, 00:13:13.407 "nvme_iov_md": false 00:13:13.407 }, 00:13:13.407 "memory_domains": [ 00:13:13.407 { 00:13:13.407 "dma_device_id": "system", 00:13:13.407 "dma_device_type": 1 00:13:13.407 }, 00:13:13.407 { 00:13:13.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.407 "dma_device_type": 2 00:13:13.407 } 00:13:13.407 ], 00:13:13.407 "driver_specific": {} 00:13:13.407 } 00:13:13.407 ] 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.407 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.667 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.667 "name": "Existed_Raid", 00:13:13.667 "uuid": "6f06f4f7-75d5-4787-8765-532e5cf29fe3", 00:13:13.667 "strip_size_kb": 64, 00:13:13.667 "state": "online", 00:13:13.667 "raid_level": "concat", 00:13:13.667 "superblock": false, 00:13:13.667 "num_base_bdevs": 2, 00:13:13.667 "num_base_bdevs_discovered": 2, 00:13:13.667 "num_base_bdevs_operational": 2, 00:13:13.667 "base_bdevs_list": [ 00:13:13.667 { 00:13:13.667 "name": "BaseBdev1", 00:13:13.667 "uuid": "aa242db6-ab80-4efd-ab24-65beda732b3e", 00:13:13.667 "is_configured": true, 00:13:13.667 "data_offset": 0, 00:13:13.667 "data_size": 65536 00:13:13.667 }, 00:13:13.667 { 00:13:13.667 "name": "BaseBdev2", 00:13:13.667 "uuid": "b3186b7d-2c75-4000-b415-8f699f6d93bf", 00:13:13.667 "is_configured": true, 00:13:13.667 "data_offset": 0, 00:13:13.667 "data_size": 65536 00:13:13.667 } 00:13:13.667 ] 00:13:13.667 }' 00:13:13.667 07:18:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.667 07:18:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:14.235 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:14.235 [2024-07-25 07:18:46.763543] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.495 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:14.495 "name": "Existed_Raid", 00:13:14.495 "aliases": [ 00:13:14.495 "6f06f4f7-75d5-4787-8765-532e5cf29fe3" 00:13:14.495 ], 00:13:14.495 "product_name": "Raid Volume", 00:13:14.495 "block_size": 512, 00:13:14.495 "num_blocks": 131072, 00:13:14.495 "uuid": 
"6f06f4f7-75d5-4787-8765-532e5cf29fe3", 00:13:14.495 "assigned_rate_limits": { 00:13:14.495 "rw_ios_per_sec": 0, 00:13:14.495 "rw_mbytes_per_sec": 0, 00:13:14.495 "r_mbytes_per_sec": 0, 00:13:14.495 "w_mbytes_per_sec": 0 00:13:14.495 }, 00:13:14.495 "claimed": false, 00:13:14.495 "zoned": false, 00:13:14.495 "supported_io_types": { 00:13:14.495 "read": true, 00:13:14.495 "write": true, 00:13:14.495 "unmap": true, 00:13:14.495 "flush": true, 00:13:14.495 "reset": true, 00:13:14.495 "nvme_admin": false, 00:13:14.495 "nvme_io": false, 00:13:14.495 "nvme_io_md": false, 00:13:14.495 "write_zeroes": true, 00:13:14.495 "zcopy": false, 00:13:14.495 "get_zone_info": false, 00:13:14.495 "zone_management": false, 00:13:14.495 "zone_append": false, 00:13:14.495 "compare": false, 00:13:14.495 "compare_and_write": false, 00:13:14.495 "abort": false, 00:13:14.495 "seek_hole": false, 00:13:14.495 "seek_data": false, 00:13:14.495 "copy": false, 00:13:14.495 "nvme_iov_md": false 00:13:14.495 }, 00:13:14.495 "memory_domains": [ 00:13:14.495 { 00:13:14.495 "dma_device_id": "system", 00:13:14.495 "dma_device_type": 1 00:13:14.495 }, 00:13:14.495 { 00:13:14.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.495 "dma_device_type": 2 00:13:14.495 }, 00:13:14.495 { 00:13:14.495 "dma_device_id": "system", 00:13:14.495 "dma_device_type": 1 00:13:14.495 }, 00:13:14.495 { 00:13:14.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.495 "dma_device_type": 2 00:13:14.495 } 00:13:14.495 ], 00:13:14.495 "driver_specific": { 00:13:14.495 "raid": { 00:13:14.495 "uuid": "6f06f4f7-75d5-4787-8765-532e5cf29fe3", 00:13:14.495 "strip_size_kb": 64, 00:13:14.495 "state": "online", 00:13:14.495 "raid_level": "concat", 00:13:14.495 "superblock": false, 00:13:14.495 "num_base_bdevs": 2, 00:13:14.495 "num_base_bdevs_discovered": 2, 00:13:14.495 "num_base_bdevs_operational": 2, 00:13:14.495 "base_bdevs_list": [ 00:13:14.495 { 00:13:14.495 "name": "BaseBdev1", 00:13:14.495 "uuid": "aa242db6-ab80-4efd-ab24-65beda732b3e", 00:13:14.495 "is_configured": true, 00:13:14.495 "data_offset": 0, 00:13:14.495 "data_size": 65536 00:13:14.495 }, 00:13:14.495 { 00:13:14.495 "name": "BaseBdev2", 00:13:14.495 "uuid": "b3186b7d-2c75-4000-b415-8f699f6d93bf", 00:13:14.495 "is_configured": true, 00:13:14.495 "data_offset": 0, 00:13:14.495 "data_size": 65536 00:13:14.495 } 00:13:14.495 ] 00:13:14.495 } 00:13:14.495 } 00:13:14.495 }' 00:13:14.495 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:14.495 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:14.495 BaseBdev2' 00:13:14.495 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.495 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.495 07:18:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.755 "name": "BaseBdev1", 00:13:14.755 "aliases": [ 00:13:14.755 "aa242db6-ab80-4efd-ab24-65beda732b3e" 00:13:14.755 ], 00:13:14.755 "product_name": "Malloc disk", 00:13:14.755 "block_size": 512, 00:13:14.755 "num_blocks": 65536, 00:13:14.755 "uuid": "aa242db6-ab80-4efd-ab24-65beda732b3e", 
00:13:14.755 "assigned_rate_limits": { 00:13:14.755 "rw_ios_per_sec": 0, 00:13:14.755 "rw_mbytes_per_sec": 0, 00:13:14.755 "r_mbytes_per_sec": 0, 00:13:14.755 "w_mbytes_per_sec": 0 00:13:14.755 }, 00:13:14.755 "claimed": true, 00:13:14.755 "claim_type": "exclusive_write", 00:13:14.755 "zoned": false, 00:13:14.755 "supported_io_types": { 00:13:14.755 "read": true, 00:13:14.755 "write": true, 00:13:14.755 "unmap": true, 00:13:14.755 "flush": true, 00:13:14.755 "reset": true, 00:13:14.755 "nvme_admin": false, 00:13:14.755 "nvme_io": false, 00:13:14.755 "nvme_io_md": false, 00:13:14.755 "write_zeroes": true, 00:13:14.755 "zcopy": true, 00:13:14.755 "get_zone_info": false, 00:13:14.755 "zone_management": false, 00:13:14.755 "zone_append": false, 00:13:14.755 "compare": false, 00:13:14.755 "compare_and_write": false, 00:13:14.755 "abort": true, 00:13:14.755 "seek_hole": false, 00:13:14.755 "seek_data": false, 00:13:14.755 "copy": true, 00:13:14.755 "nvme_iov_md": false 00:13:14.755 }, 00:13:14.755 "memory_domains": [ 00:13:14.755 { 00:13:14.755 "dma_device_id": "system", 00:13:14.755 "dma_device_type": 1 00:13:14.755 }, 00:13:14.755 { 00:13:14.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.755 "dma_device_type": 2 00:13:14.755 } 00:13:14.755 ], 00:13:14.755 "driver_specific": {} 00:13:14.755 }' 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.755 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.014 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.014 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.014 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.014 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.014 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.014 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:15.014 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.274 "name": "BaseBdev2", 00:13:15.274 "aliases": [ 00:13:15.274 "b3186b7d-2c75-4000-b415-8f699f6d93bf" 00:13:15.274 ], 00:13:15.274 "product_name": "Malloc disk", 00:13:15.274 "block_size": 512, 00:13:15.274 "num_blocks": 65536, 00:13:15.274 "uuid": "b3186b7d-2c75-4000-b415-8f699f6d93bf", 00:13:15.274 "assigned_rate_limits": { 00:13:15.274 "rw_ios_per_sec": 0, 00:13:15.274 "rw_mbytes_per_sec": 0, 00:13:15.274 "r_mbytes_per_sec": 0, 00:13:15.274 "w_mbytes_per_sec": 0 
00:13:15.274 }, 00:13:15.274 "claimed": true, 00:13:15.274 "claim_type": "exclusive_write", 00:13:15.274 "zoned": false, 00:13:15.274 "supported_io_types": { 00:13:15.274 "read": true, 00:13:15.274 "write": true, 00:13:15.274 "unmap": true, 00:13:15.274 "flush": true, 00:13:15.274 "reset": true, 00:13:15.274 "nvme_admin": false, 00:13:15.274 "nvme_io": false, 00:13:15.274 "nvme_io_md": false, 00:13:15.274 "write_zeroes": true, 00:13:15.274 "zcopy": true, 00:13:15.274 "get_zone_info": false, 00:13:15.274 "zone_management": false, 00:13:15.274 "zone_append": false, 00:13:15.274 "compare": false, 00:13:15.274 "compare_and_write": false, 00:13:15.274 "abort": true, 00:13:15.274 "seek_hole": false, 00:13:15.274 "seek_data": false, 00:13:15.274 "copy": true, 00:13:15.274 "nvme_iov_md": false 00:13:15.274 }, 00:13:15.274 "memory_domains": [ 00:13:15.274 { 00:13:15.274 "dma_device_id": "system", 00:13:15.274 "dma_device_type": 1 00:13:15.274 }, 00:13:15.274 { 00:13:15.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.274 "dma_device_type": 2 00:13:15.274 } 00:13:15.274 ], 00:13:15.274 "driver_specific": {} 00:13:15.274 }' 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.274 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.533 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.533 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.533 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.533 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.533 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.533 07:18:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:15.792 [2024-07-25 07:18:48.179054] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:15.792 [2024-07-25 07:18:48.179080] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:15.792 [2024-07-25 07:18:48.179116] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # 
verify_raid_bdev_state Existed_Raid offline concat 64 1 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.792 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.793 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.793 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.793 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.052 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.052 "name": "Existed_Raid", 00:13:16.052 "uuid": "6f06f4f7-75d5-4787-8765-532e5cf29fe3", 00:13:16.052 "strip_size_kb": 64, 00:13:16.052 "state": "offline", 00:13:16.052 "raid_level": "concat", 00:13:16.052 "superblock": false, 00:13:16.052 "num_base_bdevs": 2, 00:13:16.052 "num_base_bdevs_discovered": 1, 00:13:16.052 "num_base_bdevs_operational": 1, 00:13:16.052 "base_bdevs_list": [ 00:13:16.052 { 00:13:16.052 "name": null, 00:13:16.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.052 "is_configured": false, 00:13:16.052 "data_offset": 0, 00:13:16.052 "data_size": 65536 00:13:16.052 }, 00:13:16.052 { 00:13:16.052 "name": "BaseBdev2", 00:13:16.052 "uuid": "b3186b7d-2c75-4000-b415-8f699f6d93bf", 00:13:16.052 "is_configured": true, 00:13:16.052 "data_offset": 0, 00:13:16.052 "data_size": 65536 00:13:16.052 } 00:13:16.052 ] 00:13:16.052 }' 00:13:16.052 07:18:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.052 07:18:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:16.620 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:16.620 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:16.620 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.620 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:16.880 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:16.880 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:16.880 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:17.139 
[2024-07-25 07:18:49.451441] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:17.139 [2024-07-25 07:18:49.451488] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b2580 name Existed_Raid, state offline 00:13:17.139 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:17.139 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:17.139 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.139 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1594963 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1594963 ']' 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1594963 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1594963 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1594963' 00:13:17.399 killing process with pid 1594963 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1594963 00:13:17.399 [2024-07-25 07:18:49.763126] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:17.399 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1594963 00:13:17.399 [2024-07-25 07:18:49.763992] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:17.660 07:18:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:17.660 00:13:17.660 real 0m10.692s 00:13:17.660 user 0m19.047s 00:13:17.660 sys 0m1.951s 00:13:17.660 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:17.660 07:18:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.660 ************************************ 00:13:17.660 END TEST raid_state_function_test 00:13:17.660 ************************************ 00:13:17.660 07:18:49 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:13:17.660 07:18:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:17.660 07:18:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:17.660 07:18:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:17.660 
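For reference, the state transitions exercised above by raid_state_function_test (concat level, two base bdevs, no superblock) can be replayed by hand against a running SPDK target. The sketch below is illustrative only: it assumes an SPDK application such as the bdev_svc used above is already listening on /var/tmp/spdk-raid.sock, and it reuses only RPCs that appear in this log (bdev_raid_create, bdev_raid_get_bdevs, bdev_malloc_create, bdev_malloc_delete, bdev_raid_delete); the jq state extraction is a simplification of the test's verify_raid_bdev_state helper, not part of the original script.

#!/usr/bin/env bash
# Minimal replay of the concat raid state-function flow shown above (sketch, not the test itself).
set -euo pipefail
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

raid_state() {
    # Print the current state of Existed_Raid ("configuring", "online" or "offline").
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
}

# 1. Creating the raid bdev before its base bdevs exist leaves it in the "configuring" state.
$RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
raid_state   # configuring

# 2. Adding both 32 MiB, 512-byte-block malloc bdevs lets the raid claim them and go "online".
$RPC bdev_malloc_create 32 512 -b BaseBdev1
$RPC bdev_malloc_create 32 512 -b BaseBdev2
raid_state   # online

# 3. concat has no redundancy, so removing one base bdev drops the raid bdev to "offline".
$RPC bdev_malloc_delete BaseBdev1
raid_state   # offline

# Cleanup.
$RPC bdev_raid_delete Existed_Raid
$RPC bdev_malloc_delete BaseBdev2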
************************************ 00:13:17.660 START TEST raid_state_function_test_sb 00:13:17.660 ************************************ 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 true 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1597044 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1597044' 00:13:17.661 Process raid pid: 1597044 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1597044 /var/tmp/spdk-raid.sock 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@831 -- # '[' -z 1597044 ']' 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:17.661 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:17.661 07:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.661 [2024-07-25 07:18:50.109133] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:13:17.661 [2024-07-25 07:18:50.109211] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:13:17.661 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:17.661 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:17.661 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:18.006 [2024-07-25 07:18:50.244572] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:18.006 [2024-07-25 07:18:50.331236] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:18.006 [2024-07-25 07:18:50.386750] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.006 [2024-07-25 07:18:50.386775] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.575 07:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:18.575 07:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:13:18.575 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:18.834 [2024-07-25 07:18:51.156335] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:18.834 [2024-07-25 07:18:51.156372] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:18.834 [2024-07-25 07:18:51.156382] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:18.834 [2024-07-25 07:18:51.156392] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.834 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:19.094 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.094 "name": "Existed_Raid", 00:13:19.094 "uuid": "493e37aa-2dd6-4bf9-8305-a533ffffde93", 00:13:19.094 "strip_size_kb": 64, 00:13:19.094 "state": "configuring", 00:13:19.094 "raid_level": "concat", 00:13:19.094 "superblock": true, 00:13:19.094 "num_base_bdevs": 2, 00:13:19.094 "num_base_bdevs_discovered": 0, 00:13:19.094 "num_base_bdevs_operational": 2, 00:13:19.094 "base_bdevs_list": [ 00:13:19.094 { 00:13:19.094 "name": "BaseBdev1", 00:13:19.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.094 "is_configured": false, 00:13:19.094 "data_offset": 0, 00:13:19.094 "data_size": 0 00:13:19.094 }, 00:13:19.094 { 00:13:19.094 "name": "BaseBdev2", 00:13:19.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.094 "is_configured": false, 00:13:19.094 "data_offset": 0, 00:13:19.094 "data_size": 0 00:13:19.094 } 00:13:19.094 ] 00:13:19.094 }' 00:13:19.094 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.094 07:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:19.662 07:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:19.662 [2024-07-25 07:18:52.186900] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:19.662 [2024-07-25 07:18:52.186927] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x197fea0 name Existed_Raid, state configuring 00:13:19.922 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:19.922 [2024-07-25 
07:18:52.419534] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:19.922 [2024-07-25 07:18:52.419558] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:19.922 [2024-07-25 07:18:52.419567] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:19.922 [2024-07-25 07:18:52.419578] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:19.922 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:20.181 [2024-07-25 07:18:52.597400] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:20.181 BaseBdev1 00:13:20.181 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:20.181 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:20.181 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:20.181 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:20.181 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:20.181 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:20.181 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:20.440 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:20.440 [ 00:13:20.440 { 00:13:20.440 "name": "BaseBdev1", 00:13:20.440 "aliases": [ 00:13:20.440 "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed" 00:13:20.440 ], 00:13:20.440 "product_name": "Malloc disk", 00:13:20.440 "block_size": 512, 00:13:20.440 "num_blocks": 65536, 00:13:20.440 "uuid": "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed", 00:13:20.440 "assigned_rate_limits": { 00:13:20.440 "rw_ios_per_sec": 0, 00:13:20.441 "rw_mbytes_per_sec": 0, 00:13:20.441 "r_mbytes_per_sec": 0, 00:13:20.441 "w_mbytes_per_sec": 0 00:13:20.441 }, 00:13:20.441 "claimed": true, 00:13:20.441 "claim_type": "exclusive_write", 00:13:20.441 "zoned": false, 00:13:20.441 "supported_io_types": { 00:13:20.441 "read": true, 00:13:20.441 "write": true, 00:13:20.441 "unmap": true, 00:13:20.441 "flush": true, 00:13:20.441 "reset": true, 00:13:20.441 "nvme_admin": false, 00:13:20.441 "nvme_io": false, 00:13:20.441 "nvme_io_md": false, 00:13:20.441 "write_zeroes": true, 00:13:20.441 "zcopy": true, 00:13:20.441 "get_zone_info": false, 00:13:20.441 "zone_management": false, 00:13:20.441 "zone_append": false, 00:13:20.441 "compare": false, 00:13:20.441 "compare_and_write": false, 00:13:20.441 "abort": true, 00:13:20.441 "seek_hole": false, 00:13:20.441 "seek_data": false, 00:13:20.441 "copy": true, 00:13:20.441 "nvme_iov_md": false 00:13:20.441 }, 00:13:20.441 "memory_domains": [ 00:13:20.441 { 00:13:20.441 "dma_device_id": "system", 00:13:20.441 "dma_device_type": 1 00:13:20.441 }, 00:13:20.441 { 00:13:20.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.441 "dma_device_type": 2 
00:13:20.441 } 00:13:20.441 ], 00:13:20.441 "driver_specific": {} 00:13:20.441 } 00:13:20.441 ] 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:20.441 07:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.700 07:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.700 "name": "Existed_Raid", 00:13:20.700 "uuid": "c0f67eed-450c-4dcc-a55b-d301296384b4", 00:13:20.700 "strip_size_kb": 64, 00:13:20.700 "state": "configuring", 00:13:20.700 "raid_level": "concat", 00:13:20.700 "superblock": true, 00:13:20.700 "num_base_bdevs": 2, 00:13:20.700 "num_base_bdevs_discovered": 1, 00:13:20.700 "num_base_bdevs_operational": 2, 00:13:20.700 "base_bdevs_list": [ 00:13:20.700 { 00:13:20.700 "name": "BaseBdev1", 00:13:20.700 "uuid": "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed", 00:13:20.700 "is_configured": true, 00:13:20.700 "data_offset": 2048, 00:13:20.700 "data_size": 63488 00:13:20.700 }, 00:13:20.700 { 00:13:20.700 "name": "BaseBdev2", 00:13:20.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.700 "is_configured": false, 00:13:20.700 "data_offset": 0, 00:13:20.700 "data_size": 0 00:13:20.700 } 00:13:20.700 ] 00:13:20.700 }' 00:13:20.700 07:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.700 07:18:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:21.269 07:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:21.529 [2024-07-25 07:18:53.964988] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:21.529 [2024-07-25 07:18:53.965027] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x197f790 name Existed_Raid, state configuring 00:13:21.529 07:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:21.788 [2024-07-25 07:18:54.189636] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:21.788 [2024-07-25 07:18:54.191038] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:21.788 [2024-07-25 07:18:54.191071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:21.788 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.789 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:22.048 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.048 "name": "Existed_Raid", 00:13:22.048 "uuid": "75cd39a1-4cdc-45c5-b17b-67931aa1932c", 00:13:22.048 "strip_size_kb": 64, 00:13:22.048 "state": "configuring", 00:13:22.048 "raid_level": "concat", 00:13:22.048 "superblock": true, 00:13:22.048 "num_base_bdevs": 2, 00:13:22.048 "num_base_bdevs_discovered": 1, 00:13:22.048 "num_base_bdevs_operational": 2, 00:13:22.048 "base_bdevs_list": [ 00:13:22.048 { 00:13:22.048 "name": "BaseBdev1", 00:13:22.048 "uuid": "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed", 00:13:22.048 "is_configured": true, 00:13:22.048 "data_offset": 2048, 00:13:22.048 "data_size": 63488 00:13:22.048 }, 00:13:22.048 { 00:13:22.048 "name": "BaseBdev2", 00:13:22.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:22.048 "is_configured": false, 00:13:22.048 "data_offset": 0, 00:13:22.048 "data_size": 0 00:13:22.048 } 00:13:22.048 ] 00:13:22.048 }' 00:13:22.048 07:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.048 07:18:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:22.615 07:18:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:22.616 [2024-07-25 07:18:55.115271] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:22.616 [2024-07-25 07:18:55.115410] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1980580 00:13:22.616 [2024-07-25 07:18:55.115423] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:22.616 [2024-07-25 07:18:55.115584] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19818d0 00:13:22.616 [2024-07-25 07:18:55.115697] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1980580 00:13:22.616 [2024-07-25 07:18:55.115706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1980580 00:13:22.616 [2024-07-25 07:18:55.115794] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.616 BaseBdev2 00:13:22.616 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:22.616 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:22.616 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:22.616 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:22.616 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:22.616 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:22.616 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:22.874 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:23.134 [ 00:13:23.134 { 00:13:23.134 "name": "BaseBdev2", 00:13:23.134 "aliases": [ 00:13:23.134 "e48da961-699e-4118-806f-fee7de0fdadf" 00:13:23.134 ], 00:13:23.134 "product_name": "Malloc disk", 00:13:23.134 "block_size": 512, 00:13:23.134 "num_blocks": 65536, 00:13:23.134 "uuid": "e48da961-699e-4118-806f-fee7de0fdadf", 00:13:23.134 "assigned_rate_limits": { 00:13:23.134 "rw_ios_per_sec": 0, 00:13:23.134 "rw_mbytes_per_sec": 0, 00:13:23.134 "r_mbytes_per_sec": 0, 00:13:23.134 "w_mbytes_per_sec": 0 00:13:23.134 }, 00:13:23.134 "claimed": true, 00:13:23.134 "claim_type": "exclusive_write", 00:13:23.134 "zoned": false, 00:13:23.134 "supported_io_types": { 00:13:23.134 "read": true, 00:13:23.134 "write": true, 00:13:23.134 "unmap": true, 00:13:23.134 "flush": true, 00:13:23.134 "reset": true, 00:13:23.134 "nvme_admin": false, 00:13:23.134 "nvme_io": false, 00:13:23.134 "nvme_io_md": false, 00:13:23.134 "write_zeroes": true, 00:13:23.134 "zcopy": true, 00:13:23.134 "get_zone_info": false, 00:13:23.134 "zone_management": false, 00:13:23.134 "zone_append": false, 00:13:23.134 "compare": false, 00:13:23.134 "compare_and_write": false, 00:13:23.134 "abort": true, 00:13:23.134 "seek_hole": false, 00:13:23.134 "seek_data": false, 00:13:23.134 "copy": true, 00:13:23.134 "nvme_iov_md": false 00:13:23.134 }, 
00:13:23.134 "memory_domains": [ 00:13:23.134 { 00:13:23.134 "dma_device_id": "system", 00:13:23.134 "dma_device_type": 1 00:13:23.134 }, 00:13:23.134 { 00:13:23.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.134 "dma_device_type": 2 00:13:23.134 } 00:13:23.134 ], 00:13:23.134 "driver_specific": {} 00:13:23.134 } 00:13:23.134 ] 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.134 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.394 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.394 "name": "Existed_Raid", 00:13:23.394 "uuid": "75cd39a1-4cdc-45c5-b17b-67931aa1932c", 00:13:23.394 "strip_size_kb": 64, 00:13:23.394 "state": "online", 00:13:23.394 "raid_level": "concat", 00:13:23.394 "superblock": true, 00:13:23.394 "num_base_bdevs": 2, 00:13:23.394 "num_base_bdevs_discovered": 2, 00:13:23.394 "num_base_bdevs_operational": 2, 00:13:23.394 "base_bdevs_list": [ 00:13:23.394 { 00:13:23.394 "name": "BaseBdev1", 00:13:23.394 "uuid": "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed", 00:13:23.394 "is_configured": true, 00:13:23.394 "data_offset": 2048, 00:13:23.394 "data_size": 63488 00:13:23.394 }, 00:13:23.394 { 00:13:23.394 "name": "BaseBdev2", 00:13:23.394 "uuid": "e48da961-699e-4118-806f-fee7de0fdadf", 00:13:23.394 "is_configured": true, 00:13:23.394 "data_offset": 2048, 00:13:23.394 "data_size": 63488 00:13:23.394 } 00:13:23.394 ] 00:13:23.394 }' 00:13:23.394 07:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.394 07:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:23.963 07:18:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:23.963 [2024-07-25 07:18:56.439011] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:23.963 "name": "Existed_Raid", 00:13:23.963 "aliases": [ 00:13:23.963 "75cd39a1-4cdc-45c5-b17b-67931aa1932c" 00:13:23.963 ], 00:13:23.963 "product_name": "Raid Volume", 00:13:23.963 "block_size": 512, 00:13:23.963 "num_blocks": 126976, 00:13:23.963 "uuid": "75cd39a1-4cdc-45c5-b17b-67931aa1932c", 00:13:23.963 "assigned_rate_limits": { 00:13:23.963 "rw_ios_per_sec": 0, 00:13:23.963 "rw_mbytes_per_sec": 0, 00:13:23.963 "r_mbytes_per_sec": 0, 00:13:23.963 "w_mbytes_per_sec": 0 00:13:23.963 }, 00:13:23.963 "claimed": false, 00:13:23.963 "zoned": false, 00:13:23.963 "supported_io_types": { 00:13:23.963 "read": true, 00:13:23.963 "write": true, 00:13:23.963 "unmap": true, 00:13:23.963 "flush": true, 00:13:23.963 "reset": true, 00:13:23.963 "nvme_admin": false, 00:13:23.963 "nvme_io": false, 00:13:23.963 "nvme_io_md": false, 00:13:23.963 "write_zeroes": true, 00:13:23.963 "zcopy": false, 00:13:23.963 "get_zone_info": false, 00:13:23.963 "zone_management": false, 00:13:23.963 "zone_append": false, 00:13:23.963 "compare": false, 00:13:23.963 "compare_and_write": false, 00:13:23.963 "abort": false, 00:13:23.963 "seek_hole": false, 00:13:23.963 "seek_data": false, 00:13:23.963 "copy": false, 00:13:23.963 "nvme_iov_md": false 00:13:23.963 }, 00:13:23.963 "memory_domains": [ 00:13:23.963 { 00:13:23.963 "dma_device_id": "system", 00:13:23.963 "dma_device_type": 1 00:13:23.963 }, 00:13:23.963 { 00:13:23.963 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.963 "dma_device_type": 2 00:13:23.963 }, 00:13:23.963 { 00:13:23.963 "dma_device_id": "system", 00:13:23.963 "dma_device_type": 1 00:13:23.963 }, 00:13:23.963 { 00:13:23.963 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.963 "dma_device_type": 2 00:13:23.963 } 00:13:23.963 ], 00:13:23.963 "driver_specific": { 00:13:23.963 "raid": { 00:13:23.963 "uuid": "75cd39a1-4cdc-45c5-b17b-67931aa1932c", 00:13:23.963 "strip_size_kb": 64, 00:13:23.963 "state": "online", 00:13:23.963 "raid_level": "concat", 00:13:23.963 "superblock": true, 00:13:23.963 "num_base_bdevs": 2, 00:13:23.963 "num_base_bdevs_discovered": 2, 00:13:23.963 "num_base_bdevs_operational": 2, 00:13:23.963 "base_bdevs_list": [ 00:13:23.963 { 00:13:23.963 "name": "BaseBdev1", 00:13:23.963 "uuid": "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed", 00:13:23.963 "is_configured": true, 00:13:23.963 "data_offset": 2048, 00:13:23.963 "data_size": 63488 00:13:23.963 }, 00:13:23.963 { 00:13:23.963 "name": "BaseBdev2", 00:13:23.963 "uuid": 
"e48da961-699e-4118-806f-fee7de0fdadf", 00:13:23.963 "is_configured": true, 00:13:23.963 "data_offset": 2048, 00:13:23.963 "data_size": 63488 00:13:23.963 } 00:13:23.963 ] 00:13:23.963 } 00:13:23.963 } 00:13:23.963 }' 00:13:23.963 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:24.222 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:24.222 BaseBdev2' 00:13:24.222 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:24.222 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:24.222 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.222 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.222 "name": "BaseBdev1", 00:13:24.222 "aliases": [ 00:13:24.222 "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed" 00:13:24.222 ], 00:13:24.222 "product_name": "Malloc disk", 00:13:24.222 "block_size": 512, 00:13:24.222 "num_blocks": 65536, 00:13:24.222 "uuid": "6ebcf9e4-d7a7-4e17-8143-384b5de7c4ed", 00:13:24.222 "assigned_rate_limits": { 00:13:24.222 "rw_ios_per_sec": 0, 00:13:24.222 "rw_mbytes_per_sec": 0, 00:13:24.222 "r_mbytes_per_sec": 0, 00:13:24.222 "w_mbytes_per_sec": 0 00:13:24.222 }, 00:13:24.222 "claimed": true, 00:13:24.222 "claim_type": "exclusive_write", 00:13:24.222 "zoned": false, 00:13:24.222 "supported_io_types": { 00:13:24.222 "read": true, 00:13:24.222 "write": true, 00:13:24.222 "unmap": true, 00:13:24.222 "flush": true, 00:13:24.222 "reset": true, 00:13:24.222 "nvme_admin": false, 00:13:24.222 "nvme_io": false, 00:13:24.222 "nvme_io_md": false, 00:13:24.222 "write_zeroes": true, 00:13:24.222 "zcopy": true, 00:13:24.222 "get_zone_info": false, 00:13:24.222 "zone_management": false, 00:13:24.222 "zone_append": false, 00:13:24.222 "compare": false, 00:13:24.222 "compare_and_write": false, 00:13:24.222 "abort": true, 00:13:24.222 "seek_hole": false, 00:13:24.222 "seek_data": false, 00:13:24.222 "copy": true, 00:13:24.222 "nvme_iov_md": false 00:13:24.222 }, 00:13:24.222 "memory_domains": [ 00:13:24.222 { 00:13:24.222 "dma_device_id": "system", 00:13:24.223 "dma_device_type": 1 00:13:24.223 }, 00:13:24.223 { 00:13:24.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.223 "dma_device_type": 2 00:13:24.223 } 00:13:24.223 ], 00:13:24.223 "driver_specific": {} 00:13:24.223 }' 00:13:24.223 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.482 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.482 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.482 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.482 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.482 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.482 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.482 07:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.482 07:18:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.482 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.741 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.741 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:24.741 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:24.741 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:24.741 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:25.000 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:25.000 "name": "BaseBdev2", 00:13:25.000 "aliases": [ 00:13:25.000 "e48da961-699e-4118-806f-fee7de0fdadf" 00:13:25.000 ], 00:13:25.000 "product_name": "Malloc disk", 00:13:25.000 "block_size": 512, 00:13:25.000 "num_blocks": 65536, 00:13:25.000 "uuid": "e48da961-699e-4118-806f-fee7de0fdadf", 00:13:25.000 "assigned_rate_limits": { 00:13:25.000 "rw_ios_per_sec": 0, 00:13:25.000 "rw_mbytes_per_sec": 0, 00:13:25.000 "r_mbytes_per_sec": 0, 00:13:25.000 "w_mbytes_per_sec": 0 00:13:25.000 }, 00:13:25.000 "claimed": true, 00:13:25.000 "claim_type": "exclusive_write", 00:13:25.000 "zoned": false, 00:13:25.000 "supported_io_types": { 00:13:25.000 "read": true, 00:13:25.000 "write": true, 00:13:25.000 "unmap": true, 00:13:25.000 "flush": true, 00:13:25.000 "reset": true, 00:13:25.000 "nvme_admin": false, 00:13:25.000 "nvme_io": false, 00:13:25.000 "nvme_io_md": false, 00:13:25.000 "write_zeroes": true, 00:13:25.000 "zcopy": true, 00:13:25.000 "get_zone_info": false, 00:13:25.000 "zone_management": false, 00:13:25.000 "zone_append": false, 00:13:25.000 "compare": false, 00:13:25.000 "compare_and_write": false, 00:13:25.000 "abort": true, 00:13:25.000 "seek_hole": false, 00:13:25.000 "seek_data": false, 00:13:25.000 "copy": true, 00:13:25.000 "nvme_iov_md": false 00:13:25.000 }, 00:13:25.000 "memory_domains": [ 00:13:25.000 { 00:13:25.000 "dma_device_id": "system", 00:13:25.000 "dma_device_type": 1 00:13:25.000 }, 00:13:25.000 { 00:13:25.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.000 "dma_device_type": 2 00:13:25.000 } 00:13:25.000 ], 00:13:25.000 "driver_specific": {} 00:13:25.000 }' 00:13:25.000 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.000 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.001 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:25.001 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.001 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.001 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:25.001 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.001 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.260 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:25.260 07:18:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.260 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.260 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:25.260 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:25.519 [2024-07-25 07:18:57.826451] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:25.519 [2024-07-25 07:18:57.826476] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:25.519 [2024-07-25 07:18:57.826513] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.519 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.520 07:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.779 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.779 "name": "Existed_Raid", 00:13:25.779 "uuid": "75cd39a1-4cdc-45c5-b17b-67931aa1932c", 00:13:25.779 "strip_size_kb": 64, 00:13:25.779 "state": "offline", 00:13:25.779 "raid_level": "concat", 00:13:25.779 "superblock": true, 00:13:25.779 "num_base_bdevs": 2, 00:13:25.779 "num_base_bdevs_discovered": 1, 00:13:25.779 "num_base_bdevs_operational": 1, 00:13:25.779 "base_bdevs_list": [ 00:13:25.779 { 00:13:25.779 "name": null, 00:13:25.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.779 "is_configured": 
false, 00:13:25.779 "data_offset": 2048, 00:13:25.779 "data_size": 63488 00:13:25.779 }, 00:13:25.779 { 00:13:25.779 "name": "BaseBdev2", 00:13:25.779 "uuid": "e48da961-699e-4118-806f-fee7de0fdadf", 00:13:25.779 "is_configured": true, 00:13:25.779 "data_offset": 2048, 00:13:25.779 "data_size": 63488 00:13:25.779 } 00:13:25.779 ] 00:13:25.779 }' 00:13:25.779 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.779 07:18:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:26.347 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:26.347 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:26.347 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.347 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:26.607 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:26.607 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:26.607 07:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:26.607 [2024-07-25 07:18:59.098769] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:26.607 [2024-07-25 07:18:59.098817] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1980580 name Existed_Raid, state offline 00:13:26.607 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:26.607 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:26.607 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.607 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1597044 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1597044 ']' 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1597044 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:26.866 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1597044 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1597044' 00:13:27.125 killing process with pid 1597044 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1597044 00:13:27.125 [2024-07-25 07:18:59.407694] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1597044 00:13:27.125 [2024-07-25 07:18:59.408546] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:27.125 00:13:27.125 real 0m9.554s 00:13:27.125 user 0m16.935s 00:13:27.125 sys 0m1.815s 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:27.125 07:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:27.125 ************************************ 00:13:27.125 END TEST raid_state_function_test_sb 00:13:27.125 ************************************ 00:13:27.125 07:18:59 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:13:27.125 07:18:59 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:13:27.125 07:18:59 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:27.125 07:18:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:27.385 ************************************ 00:13:27.385 START TEST raid_superblock_test 00:13:27.385 ************************************ 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 2 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
strip_size_create_arg='-z 64' 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1598905 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1598905 /var/tmp/spdk-raid.sock 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1598905 ']' 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:27.385 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:27.385 07:18:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.385 [2024-07-25 07:18:59.737842] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:13:27.385 [2024-07-25 07:18:59.737896] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1598905 ] 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.4 cannot be 
used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:27.385 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.385 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:27.386 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.386 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:27.386 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.386 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:27.386 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.386 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:27.386 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.386 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:27.386 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.386 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:27.386 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:27.386 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:27.386 [2024-07-25 07:18:59.869791] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:27.645 [2024-07-25 07:18:59.956093] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:27.645 [2024-07-25 07:19:00.017229] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:27.645 [2024-07-25 07:19:00.017266] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:13:28.212 07:19:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:28.212 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:28.471 malloc1 00:13:28.471 07:19:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:28.730 [2024-07-25 07:19:01.075789] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:28.730 [2024-07-25 07:19:01.075830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.730 [2024-07-25 07:19:01.075849] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1316280 00:13:28.730 [2024-07-25 07:19:01.075860] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.730 [2024-07-25 07:19:01.077622] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.730 [2024-07-25 07:19:01.077649] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:28.730 pt1 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:28.730 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:28.989 malloc2 00:13:28.989 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:29.249 [2024-07-25 07:19:01.537466] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:29.249 [2024-07-25 07:19:01.537506] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:29.249 [2024-07-25 07:19:01.537521] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c18c0 00:13:29.249 [2024-07-25 07:19:01.537533] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:29.249 [2024-07-25 07:19:01.538882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:29.249 [2024-07-25 07:19:01.538909] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:29.249 pt2 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:13:29.249 [2024-07-25 07:19:01.762078] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:29.249 [2024-07-25 07:19:01.763247] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:29.249 [2024-07-25 07:19:01.763379] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14bf720 00:13:29.249 [2024-07-25 07:19:01.763392] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:29.249 [2024-07-25 07:19:01.763564] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13176e0 00:13:29.249 [2024-07-25 07:19:01.763689] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14bf720 00:13:29.249 [2024-07-25 07:19:01.763706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14bf720 00:13:29.249 [2024-07-25 07:19:01.763793] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.249 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.509 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.509 07:19:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:29.509 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.509 "name": "raid_bdev1", 00:13:29.509 "uuid": "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05", 00:13:29.509 "strip_size_kb": 
64, 00:13:29.509 "state": "online", 00:13:29.509 "raid_level": "concat", 00:13:29.509 "superblock": true, 00:13:29.509 "num_base_bdevs": 2, 00:13:29.509 "num_base_bdevs_discovered": 2, 00:13:29.509 "num_base_bdevs_operational": 2, 00:13:29.509 "base_bdevs_list": [ 00:13:29.509 { 00:13:29.509 "name": "pt1", 00:13:29.509 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:29.509 "is_configured": true, 00:13:29.509 "data_offset": 2048, 00:13:29.509 "data_size": 63488 00:13:29.509 }, 00:13:29.509 { 00:13:29.509 "name": "pt2", 00:13:29.509 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:29.509 "is_configured": true, 00:13:29.509 "data_offset": 2048, 00:13:29.509 "data_size": 63488 00:13:29.509 } 00:13:29.509 ] 00:13:29.509 }' 00:13:29.509 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.509 07:19:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:30.076 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:30.334 [2024-07-25 07:19:02.700725] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:30.334 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:30.334 "name": "raid_bdev1", 00:13:30.334 "aliases": [ 00:13:30.334 "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05" 00:13:30.334 ], 00:13:30.334 "product_name": "Raid Volume", 00:13:30.334 "block_size": 512, 00:13:30.334 "num_blocks": 126976, 00:13:30.334 "uuid": "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05", 00:13:30.334 "assigned_rate_limits": { 00:13:30.334 "rw_ios_per_sec": 0, 00:13:30.334 "rw_mbytes_per_sec": 0, 00:13:30.334 "r_mbytes_per_sec": 0, 00:13:30.334 "w_mbytes_per_sec": 0 00:13:30.334 }, 00:13:30.334 "claimed": false, 00:13:30.334 "zoned": false, 00:13:30.334 "supported_io_types": { 00:13:30.334 "read": true, 00:13:30.334 "write": true, 00:13:30.334 "unmap": true, 00:13:30.334 "flush": true, 00:13:30.334 "reset": true, 00:13:30.334 "nvme_admin": false, 00:13:30.334 "nvme_io": false, 00:13:30.334 "nvme_io_md": false, 00:13:30.334 "write_zeroes": true, 00:13:30.334 "zcopy": false, 00:13:30.334 "get_zone_info": false, 00:13:30.334 "zone_management": false, 00:13:30.334 "zone_append": false, 00:13:30.334 "compare": false, 00:13:30.334 "compare_and_write": false, 00:13:30.334 "abort": false, 00:13:30.334 "seek_hole": false, 00:13:30.334 "seek_data": false, 00:13:30.334 "copy": false, 00:13:30.334 "nvme_iov_md": false 00:13:30.334 }, 00:13:30.334 "memory_domains": [ 00:13:30.334 { 00:13:30.334 "dma_device_id": "system", 00:13:30.334 "dma_device_type": 1 00:13:30.334 }, 00:13:30.334 { 00:13:30.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:13:30.334 "dma_device_type": 2 00:13:30.334 }, 00:13:30.334 { 00:13:30.334 "dma_device_id": "system", 00:13:30.334 "dma_device_type": 1 00:13:30.334 }, 00:13:30.334 { 00:13:30.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.334 "dma_device_type": 2 00:13:30.334 } 00:13:30.334 ], 00:13:30.334 "driver_specific": { 00:13:30.334 "raid": { 00:13:30.334 "uuid": "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05", 00:13:30.334 "strip_size_kb": 64, 00:13:30.334 "state": "online", 00:13:30.334 "raid_level": "concat", 00:13:30.334 "superblock": true, 00:13:30.334 "num_base_bdevs": 2, 00:13:30.334 "num_base_bdevs_discovered": 2, 00:13:30.334 "num_base_bdevs_operational": 2, 00:13:30.334 "base_bdevs_list": [ 00:13:30.334 { 00:13:30.334 "name": "pt1", 00:13:30.334 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:30.334 "is_configured": true, 00:13:30.334 "data_offset": 2048, 00:13:30.334 "data_size": 63488 00:13:30.334 }, 00:13:30.334 { 00:13:30.334 "name": "pt2", 00:13:30.334 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:30.334 "is_configured": true, 00:13:30.334 "data_offset": 2048, 00:13:30.334 "data_size": 63488 00:13:30.334 } 00:13:30.334 ] 00:13:30.334 } 00:13:30.334 } 00:13:30.334 }' 00:13:30.334 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:30.334 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:30.334 pt2' 00:13:30.334 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.334 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:30.334 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.592 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.592 "name": "pt1", 00:13:30.592 "aliases": [ 00:13:30.592 "00000000-0000-0000-0000-000000000001" 00:13:30.592 ], 00:13:30.592 "product_name": "passthru", 00:13:30.592 "block_size": 512, 00:13:30.592 "num_blocks": 65536, 00:13:30.593 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:30.593 "assigned_rate_limits": { 00:13:30.593 "rw_ios_per_sec": 0, 00:13:30.593 "rw_mbytes_per_sec": 0, 00:13:30.593 "r_mbytes_per_sec": 0, 00:13:30.593 "w_mbytes_per_sec": 0 00:13:30.593 }, 00:13:30.593 "claimed": true, 00:13:30.593 "claim_type": "exclusive_write", 00:13:30.593 "zoned": false, 00:13:30.593 "supported_io_types": { 00:13:30.593 "read": true, 00:13:30.593 "write": true, 00:13:30.593 "unmap": true, 00:13:30.593 "flush": true, 00:13:30.593 "reset": true, 00:13:30.593 "nvme_admin": false, 00:13:30.593 "nvme_io": false, 00:13:30.593 "nvme_io_md": false, 00:13:30.593 "write_zeroes": true, 00:13:30.593 "zcopy": true, 00:13:30.593 "get_zone_info": false, 00:13:30.593 "zone_management": false, 00:13:30.593 "zone_append": false, 00:13:30.593 "compare": false, 00:13:30.593 "compare_and_write": false, 00:13:30.593 "abort": true, 00:13:30.593 "seek_hole": false, 00:13:30.593 "seek_data": false, 00:13:30.593 "copy": true, 00:13:30.593 "nvme_iov_md": false 00:13:30.593 }, 00:13:30.593 "memory_domains": [ 00:13:30.593 { 00:13:30.593 "dma_device_id": "system", 00:13:30.593 "dma_device_type": 1 00:13:30.593 }, 00:13:30.593 { 00:13:30.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.593 "dma_device_type": 2 00:13:30.593 } 00:13:30.593 ], 
00:13:30.593 "driver_specific": { 00:13:30.593 "passthru": { 00:13:30.593 "name": "pt1", 00:13:30.593 "base_bdev_name": "malloc1" 00:13:30.593 } 00:13:30.593 } 00:13:30.593 }' 00:13:30.593 07:19:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.593 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.593 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.593 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.593 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:30.851 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:31.109 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:31.109 "name": "pt2", 00:13:31.109 "aliases": [ 00:13:31.109 "00000000-0000-0000-0000-000000000002" 00:13:31.109 ], 00:13:31.109 "product_name": "passthru", 00:13:31.109 "block_size": 512, 00:13:31.109 "num_blocks": 65536, 00:13:31.109 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:31.109 "assigned_rate_limits": { 00:13:31.109 "rw_ios_per_sec": 0, 00:13:31.109 "rw_mbytes_per_sec": 0, 00:13:31.109 "r_mbytes_per_sec": 0, 00:13:31.109 "w_mbytes_per_sec": 0 00:13:31.109 }, 00:13:31.109 "claimed": true, 00:13:31.109 "claim_type": "exclusive_write", 00:13:31.109 "zoned": false, 00:13:31.109 "supported_io_types": { 00:13:31.109 "read": true, 00:13:31.109 "write": true, 00:13:31.109 "unmap": true, 00:13:31.109 "flush": true, 00:13:31.109 "reset": true, 00:13:31.109 "nvme_admin": false, 00:13:31.109 "nvme_io": false, 00:13:31.109 "nvme_io_md": false, 00:13:31.109 "write_zeroes": true, 00:13:31.109 "zcopy": true, 00:13:31.109 "get_zone_info": false, 00:13:31.109 "zone_management": false, 00:13:31.109 "zone_append": false, 00:13:31.109 "compare": false, 00:13:31.109 "compare_and_write": false, 00:13:31.109 "abort": true, 00:13:31.109 "seek_hole": false, 00:13:31.109 "seek_data": false, 00:13:31.109 "copy": true, 00:13:31.109 "nvme_iov_md": false 00:13:31.109 }, 00:13:31.109 "memory_domains": [ 00:13:31.109 { 00:13:31.109 "dma_device_id": "system", 00:13:31.109 "dma_device_type": 1 00:13:31.109 }, 00:13:31.109 { 00:13:31.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.109 "dma_device_type": 2 00:13:31.109 } 00:13:31.109 ], 00:13:31.109 "driver_specific": { 00:13:31.109 "passthru": { 00:13:31.109 "name": "pt2", 00:13:31.109 "base_bdev_name": "malloc2" 
00:13:31.109 } 00:13:31.109 } 00:13:31.109 }' 00:13:31.109 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.109 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.109 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:31.109 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:31.369 07:19:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:13:31.628 [2024-07-25 07:19:04.088388] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:31.628 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05 00:13:31.628 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05 ']' 00:13:31.628 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:31.929 [2024-07-25 07:19:04.312734] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:31.929 [2024-07-25 07:19:04.312749] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:31.929 [2024-07-25 07:19:04.312795] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:31.929 [2024-07-25 07:19:04.312834] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:31.929 [2024-07-25 07:19:04.312848] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14bf720 name raid_bdev1, state offline 00:13:31.929 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.929 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:13:32.217 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:13:32.217 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:13:32.217 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:13:32.217 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:32.476 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:13:32.476 07:19:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:32.735 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:13:32.994 [2024-07-25 07:19:05.467729] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:32.994 [2024-07-25 07:19:05.469004] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:32.994 [2024-07-25 07:19:05.469056] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:32.994 [2024-07-25 07:19:05.469092] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:32.994 [2024-07-25 07:19:05.469114] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:32.994 [2024-07-25 
07:19:05.469123] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c2e30 name raid_bdev1, state configuring 00:13:32.994 request: 00:13:32.994 { 00:13:32.994 "name": "raid_bdev1", 00:13:32.994 "raid_level": "concat", 00:13:32.994 "base_bdevs": [ 00:13:32.994 "malloc1", 00:13:32.994 "malloc2" 00:13:32.994 ], 00:13:32.994 "strip_size_kb": 64, 00:13:32.994 "superblock": false, 00:13:32.994 "method": "bdev_raid_create", 00:13:32.994 "req_id": 1 00:13:32.994 } 00:13:32.994 Got JSON-RPC error response 00:13:32.994 response: 00:13:32.994 { 00:13:32.994 "code": -17, 00:13:32.994 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:32.994 } 00:13:32.994 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:13:32.994 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:13:32.994 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:13:32.994 07:19:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:13:32.994 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.994 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:13:33.253 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:13:33.253 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:13:33.253 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:33.512 [2024-07-25 07:19:05.924890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:33.512 [2024-07-25 07:19:05.924932] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:33.512 [2024-07-25 07:19:05.924949] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bf490 00:13:33.512 [2024-07-25 07:19:05.924960] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:33.512 [2024-07-25 07:19:05.926420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:33.512 [2024-07-25 07:19:05.926446] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:33.512 [2024-07-25 07:19:05.926502] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:33.512 [2024-07-25 07:19:05.926526] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:33.512 pt1 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.512 07:19:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:33.771 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.771 "name": "raid_bdev1", 00:13:33.771 "uuid": "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05", 00:13:33.772 "strip_size_kb": 64, 00:13:33.772 "state": "configuring", 00:13:33.772 "raid_level": "concat", 00:13:33.772 "superblock": true, 00:13:33.772 "num_base_bdevs": 2, 00:13:33.772 "num_base_bdevs_discovered": 1, 00:13:33.772 "num_base_bdevs_operational": 2, 00:13:33.772 "base_bdevs_list": [ 00:13:33.772 { 00:13:33.772 "name": "pt1", 00:13:33.772 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:33.772 "is_configured": true, 00:13:33.772 "data_offset": 2048, 00:13:33.772 "data_size": 63488 00:13:33.772 }, 00:13:33.772 { 00:13:33.772 "name": null, 00:13:33.772 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:33.772 "is_configured": false, 00:13:33.772 "data_offset": 2048, 00:13:33.772 "data_size": 63488 00:13:33.772 } 00:13:33.772 ] 00:13:33.772 }' 00:13:33.772 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.772 07:19:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.339 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:13:34.339 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:13:34.339 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:13:34.339 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:34.598 [2024-07-25 07:19:06.947730] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:34.598 [2024-07-25 07:19:06.947773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:34.598 [2024-07-25 07:19:06.947790] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c34f0 00:13:34.598 [2024-07-25 07:19:06.947801] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.598 [2024-07-25 07:19:06.948099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:34.598 [2024-07-25 07:19:06.948114] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:34.598 [2024-07-25 07:19:06.948178] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:34.598 [2024-07-25 07:19:06.948195] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:34.598 [2024-07-25 07:19:06.948282] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1315220 00:13:34.598 [2024-07-25 07:19:06.948291] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:34.598 [2024-07-25 07:19:06.948445] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c2800 00:13:34.598 [2024-07-25 07:19:06.948555] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1315220 00:13:34.598 [2024-07-25 07:19:06.948564] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1315220 00:13:34.598 [2024-07-25 07:19:06.948649] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.598 pt2 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.598 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:34.599 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.599 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.599 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.599 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.599 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.599 07:19:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:34.858 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.858 "name": "raid_bdev1", 00:13:34.858 "uuid": "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05", 00:13:34.858 "strip_size_kb": 64, 00:13:34.858 "state": "online", 00:13:34.858 "raid_level": "concat", 00:13:34.858 "superblock": true, 00:13:34.858 "num_base_bdevs": 2, 00:13:34.858 "num_base_bdevs_discovered": 2, 00:13:34.858 "num_base_bdevs_operational": 2, 00:13:34.858 "base_bdevs_list": [ 00:13:34.858 { 00:13:34.858 "name": "pt1", 00:13:34.859 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:34.859 "is_configured": true, 00:13:34.859 "data_offset": 2048, 00:13:34.859 "data_size": 63488 00:13:34.859 }, 00:13:34.859 { 00:13:34.859 "name": "pt2", 00:13:34.859 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:34.859 "is_configured": true, 00:13:34.859 "data_offset": 2048, 00:13:34.859 "data_size": 63488 00:13:34.859 } 00:13:34.859 ] 00:13:34.859 }' 00:13:34.859 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.859 07:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.427 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:13:35.427 07:19:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:35.427 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:35.427 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:35.427 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:35.427 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:35.427 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:35.427 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:35.687 [2024-07-25 07:19:07.966624] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:35.687 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:35.687 "name": "raid_bdev1", 00:13:35.687 "aliases": [ 00:13:35.687 "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05" 00:13:35.687 ], 00:13:35.687 "product_name": "Raid Volume", 00:13:35.687 "block_size": 512, 00:13:35.687 "num_blocks": 126976, 00:13:35.687 "uuid": "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05", 00:13:35.687 "assigned_rate_limits": { 00:13:35.687 "rw_ios_per_sec": 0, 00:13:35.687 "rw_mbytes_per_sec": 0, 00:13:35.687 "r_mbytes_per_sec": 0, 00:13:35.687 "w_mbytes_per_sec": 0 00:13:35.687 }, 00:13:35.687 "claimed": false, 00:13:35.687 "zoned": false, 00:13:35.687 "supported_io_types": { 00:13:35.687 "read": true, 00:13:35.687 "write": true, 00:13:35.687 "unmap": true, 00:13:35.687 "flush": true, 00:13:35.687 "reset": true, 00:13:35.687 "nvme_admin": false, 00:13:35.687 "nvme_io": false, 00:13:35.687 "nvme_io_md": false, 00:13:35.687 "write_zeroes": true, 00:13:35.687 "zcopy": false, 00:13:35.687 "get_zone_info": false, 00:13:35.687 "zone_management": false, 00:13:35.687 "zone_append": false, 00:13:35.687 "compare": false, 00:13:35.687 "compare_and_write": false, 00:13:35.687 "abort": false, 00:13:35.687 "seek_hole": false, 00:13:35.687 "seek_data": false, 00:13:35.687 "copy": false, 00:13:35.687 "nvme_iov_md": false 00:13:35.687 }, 00:13:35.687 "memory_domains": [ 00:13:35.687 { 00:13:35.687 "dma_device_id": "system", 00:13:35.687 "dma_device_type": 1 00:13:35.687 }, 00:13:35.687 { 00:13:35.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.687 "dma_device_type": 2 00:13:35.687 }, 00:13:35.687 { 00:13:35.687 "dma_device_id": "system", 00:13:35.687 "dma_device_type": 1 00:13:35.687 }, 00:13:35.687 { 00:13:35.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.687 "dma_device_type": 2 00:13:35.687 } 00:13:35.687 ], 00:13:35.687 "driver_specific": { 00:13:35.687 "raid": { 00:13:35.687 "uuid": "6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05", 00:13:35.687 "strip_size_kb": 64, 00:13:35.687 "state": "online", 00:13:35.687 "raid_level": "concat", 00:13:35.687 "superblock": true, 00:13:35.687 "num_base_bdevs": 2, 00:13:35.687 "num_base_bdevs_discovered": 2, 00:13:35.687 "num_base_bdevs_operational": 2, 00:13:35.687 "base_bdevs_list": [ 00:13:35.687 { 00:13:35.687 "name": "pt1", 00:13:35.687 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:35.687 "is_configured": true, 00:13:35.687 "data_offset": 2048, 00:13:35.687 "data_size": 63488 00:13:35.687 }, 00:13:35.687 { 00:13:35.687 "name": "pt2", 00:13:35.687 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:35.687 "is_configured": true, 00:13:35.687 
"data_offset": 2048, 00:13:35.687 "data_size": 63488 00:13:35.687 } 00:13:35.687 ] 00:13:35.687 } 00:13:35.687 } 00:13:35.687 }' 00:13:35.687 07:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:35.687 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:35.687 pt2' 00:13:35.687 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:35.687 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:35.687 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:35.947 "name": "pt1", 00:13:35.947 "aliases": [ 00:13:35.947 "00000000-0000-0000-0000-000000000001" 00:13:35.947 ], 00:13:35.947 "product_name": "passthru", 00:13:35.947 "block_size": 512, 00:13:35.947 "num_blocks": 65536, 00:13:35.947 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:35.947 "assigned_rate_limits": { 00:13:35.947 "rw_ios_per_sec": 0, 00:13:35.947 "rw_mbytes_per_sec": 0, 00:13:35.947 "r_mbytes_per_sec": 0, 00:13:35.947 "w_mbytes_per_sec": 0 00:13:35.947 }, 00:13:35.947 "claimed": true, 00:13:35.947 "claim_type": "exclusive_write", 00:13:35.947 "zoned": false, 00:13:35.947 "supported_io_types": { 00:13:35.947 "read": true, 00:13:35.947 "write": true, 00:13:35.947 "unmap": true, 00:13:35.947 "flush": true, 00:13:35.947 "reset": true, 00:13:35.947 "nvme_admin": false, 00:13:35.947 "nvme_io": false, 00:13:35.947 "nvme_io_md": false, 00:13:35.947 "write_zeroes": true, 00:13:35.947 "zcopy": true, 00:13:35.947 "get_zone_info": false, 00:13:35.947 "zone_management": false, 00:13:35.947 "zone_append": false, 00:13:35.947 "compare": false, 00:13:35.947 "compare_and_write": false, 00:13:35.947 "abort": true, 00:13:35.947 "seek_hole": false, 00:13:35.947 "seek_data": false, 00:13:35.947 "copy": true, 00:13:35.947 "nvme_iov_md": false 00:13:35.947 }, 00:13:35.947 "memory_domains": [ 00:13:35.947 { 00:13:35.947 "dma_device_id": "system", 00:13:35.947 "dma_device_type": 1 00:13:35.947 }, 00:13:35.947 { 00:13:35.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.947 "dma_device_type": 2 00:13:35.947 } 00:13:35.947 ], 00:13:35.947 "driver_specific": { 00:13:35.947 "passthru": { 00:13:35.947 "name": "pt1", 00:13:35.947 "base_bdev_name": "malloc1" 00:13:35.947 } 00:13:35.947 } 00:13:35.947 }' 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:35.947 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.206 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:13:36.206 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.206 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.206 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.206 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.206 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:36.206 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.466 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.466 "name": "pt2", 00:13:36.466 "aliases": [ 00:13:36.466 "00000000-0000-0000-0000-000000000002" 00:13:36.466 ], 00:13:36.466 "product_name": "passthru", 00:13:36.466 "block_size": 512, 00:13:36.466 "num_blocks": 65536, 00:13:36.466 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:36.466 "assigned_rate_limits": { 00:13:36.466 "rw_ios_per_sec": 0, 00:13:36.466 "rw_mbytes_per_sec": 0, 00:13:36.466 "r_mbytes_per_sec": 0, 00:13:36.466 "w_mbytes_per_sec": 0 00:13:36.466 }, 00:13:36.466 "claimed": true, 00:13:36.466 "claim_type": "exclusive_write", 00:13:36.466 "zoned": false, 00:13:36.466 "supported_io_types": { 00:13:36.466 "read": true, 00:13:36.466 "write": true, 00:13:36.466 "unmap": true, 00:13:36.466 "flush": true, 00:13:36.466 "reset": true, 00:13:36.466 "nvme_admin": false, 00:13:36.466 "nvme_io": false, 00:13:36.466 "nvme_io_md": false, 00:13:36.466 "write_zeroes": true, 00:13:36.466 "zcopy": true, 00:13:36.466 "get_zone_info": false, 00:13:36.466 "zone_management": false, 00:13:36.466 "zone_append": false, 00:13:36.466 "compare": false, 00:13:36.466 "compare_and_write": false, 00:13:36.466 "abort": true, 00:13:36.466 "seek_hole": false, 00:13:36.466 "seek_data": false, 00:13:36.466 "copy": true, 00:13:36.466 "nvme_iov_md": false 00:13:36.466 }, 00:13:36.466 "memory_domains": [ 00:13:36.466 { 00:13:36.466 "dma_device_id": "system", 00:13:36.466 "dma_device_type": 1 00:13:36.466 }, 00:13:36.466 { 00:13:36.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.466 "dma_device_type": 2 00:13:36.466 } 00:13:36.466 ], 00:13:36.466 "driver_specific": { 00:13:36.466 "passthru": { 00:13:36.466 "name": "pt2", 00:13:36.466 "base_bdev_name": "malloc2" 00:13:36.466 } 00:13:36.466 } 00:13:36.466 }' 00:13:36.466 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.466 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.466 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:36.466 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.466 07:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.725 07:19:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:36.725 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:13:36.985 [2024-07-25 07:19:09.386378] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05 '!=' 6fa1b0b4-eb7f-49ac-a585-01cb8c77cd05 ']' 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1598905 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1598905 ']' 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1598905 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1598905 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1598905' 00:13:36.985 killing process with pid 1598905 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1598905 00:13:36.985 [2024-07-25 07:19:09.461321] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:36.985 [2024-07-25 07:19:09.461369] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:36.985 [2024-07-25 07:19:09.461407] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:36.985 [2024-07-25 07:19:09.461418] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1315220 name raid_bdev1, state offline 00:13:36.985 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1598905 00:13:36.985 [2024-07-25 07:19:09.477272] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.245 07:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:13:37.245 00:13:37.245 real 0m9.989s 00:13:37.245 user 0m17.752s 00:13:37.245 sys 0m1.925s 00:13:37.245 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:37.245 07:19:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.245 ************************************ 00:13:37.245 END TEST raid_superblock_test 00:13:37.245 ************************************ 00:13:37.245 07:19:09 bdev_raid -- bdev/bdev_raid.sh@950 -- # 
run_test raid_read_error_test raid_io_error_test concat 2 read 00:13:37.245 07:19:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:37.245 07:19:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:37.245 07:19:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:37.245 ************************************ 00:13:37.245 START TEST raid_read_error_test 00:13:37.245 ************************************ 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 read 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.JQ5ycSDBGo 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1600924 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1600924 /var/tmp/spdk-raid.sock 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 
1600924 ']' 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:37.245 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:37.245 07:19:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.505 [2024-07-25 07:19:09.801178] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:13:37.505 [2024-07-25 07:19:09.801238] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1600924 ] 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.0 
cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:37.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:37.505 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:37.505 [2024-07-25 07:19:09.932260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:37.505 [2024-07-25 07:19:10.022742] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:37.765 [2024-07-25 07:19:10.080030] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:37.765 [2024-07-25 07:19:10.080061] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.333 07:19:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:38.333 07:19:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:13:38.333 07:19:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:38.333 07:19:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:38.592 BaseBdev1_malloc 00:13:38.592 07:19:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:38.851 true 00:13:38.851 07:19:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:38.851 [2024-07-25 07:19:11.372030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:38.851 [2024-07-25 07:19:11.372069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:38.851 [2024-07-25 07:19:11.372088] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232ea50 00:13:38.851 [2024-07-25 07:19:11.372100] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:38.851 [2024-07-25 07:19:11.373572] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:38.851 [2024-07-25 07:19:11.373599] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:38.851 BaseBdev1 00:13:39.111 07:19:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:39.111 07:19:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:39.111 BaseBdev2_malloc 00:13:39.111 07:19:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:39.370 true 00:13:39.370 07:19:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:39.629 [2024-07-25 07:19:12.054115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:39.629 [2024-07-25 07:19:12.054159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.629 [2024-07-25 07:19:12.054177] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d7f40 00:13:39.629 [2024-07-25 07:19:12.054189] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.629 [2024-07-25 07:19:12.055569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.629 [2024-07-25 07:19:12.055594] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:39.629 BaseBdev2 00:13:39.629 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:39.888 [2024-07-25 07:19:12.278730] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:39.888 [2024-07-25 07:19:12.279896] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:39.888 [2024-07-25 07:19:12.280077] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24da860 00:13:39.888 [2024-07-25 07:19:12.280091] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:39.888 [2024-07-25 07:19:12.280275] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24da280 00:13:39.888 [2024-07-25 07:19:12.280412] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24da860 00:13:39.888 [2024-07-25 07:19:12.280421] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24da860 00:13:39.888 
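For reference, the base-bdev stack that raid_read_error_test has assembled up to this point corresponds to the following RPC sequence. This is a minimal sketch reconstructed only from the calls visible in this log; the RPC shorthand variable is introduced here for brevity, and the surrounding bdev_raid.sh helper plumbing (xtrace, retries) is omitted.

# Each base bdev is a malloc disk wrapped by an error bdev (named EE_<malloc>) and a
# passthru bdev, so I/O failures can be injected later without touching the raid code itself.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc            # 32 MiB backing store, 512 B blocks
$RPC bdev_error_create BaseBdev1_malloc                       # exposes EE_BaseBdev1_malloc
$RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
$RPC bdev_malloc_create 32 512 -b BaseBdev2_malloc
$RPC bdev_error_create BaseBdev2_malloc                       # exposes EE_BaseBdev2_malloc
$RPC bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
# Concat raid over the two passthru bdevs, 64 KiB strip size, with an on-disk superblock (-s).
$RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s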
[2024-07-25 07:19:12.280518] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:39.888 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:39.888 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:39.888 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:39.888 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.889 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:40.148 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.148 "name": "raid_bdev1", 00:13:40.148 "uuid": "2512a1cd-755d-4720-ba63-0dfd5e49a6b8", 00:13:40.148 "strip_size_kb": 64, 00:13:40.148 "state": "online", 00:13:40.148 "raid_level": "concat", 00:13:40.148 "superblock": true, 00:13:40.148 "num_base_bdevs": 2, 00:13:40.148 "num_base_bdevs_discovered": 2, 00:13:40.148 "num_base_bdevs_operational": 2, 00:13:40.148 "base_bdevs_list": [ 00:13:40.148 { 00:13:40.148 "name": "BaseBdev1", 00:13:40.148 "uuid": "5b051047-a9c3-5564-8757-854d6bff12ac", 00:13:40.148 "is_configured": true, 00:13:40.148 "data_offset": 2048, 00:13:40.148 "data_size": 63488 00:13:40.148 }, 00:13:40.148 { 00:13:40.148 "name": "BaseBdev2", 00:13:40.148 "uuid": "3083d162-2a4f-5343-872b-a885d0cb592f", 00:13:40.148 "is_configured": true, 00:13:40.148 "data_offset": 2048, 00:13:40.148 "data_size": 63488 00:13:40.148 } 00:13:40.148 ] 00:13:40.148 }' 00:13:40.148 07:19:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.148 07:19:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.716 07:19:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:40.716 07:19:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:13:40.716 [2024-07-25 07:19:13.185352] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24da280 00:13:41.653 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat 
= \r\a\i\d\1 ]] 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:41.912 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:41.913 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:41.913 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.913 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.913 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.913 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.913 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.913 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.172 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.172 "name": "raid_bdev1", 00:13:42.172 "uuid": "2512a1cd-755d-4720-ba63-0dfd5e49a6b8", 00:13:42.172 "strip_size_kb": 64, 00:13:42.172 "state": "online", 00:13:42.172 "raid_level": "concat", 00:13:42.172 "superblock": true, 00:13:42.172 "num_base_bdevs": 2, 00:13:42.172 "num_base_bdevs_discovered": 2, 00:13:42.172 "num_base_bdevs_operational": 2, 00:13:42.172 "base_bdevs_list": [ 00:13:42.172 { 00:13:42.172 "name": "BaseBdev1", 00:13:42.172 "uuid": "5b051047-a9c3-5564-8757-854d6bff12ac", 00:13:42.172 "is_configured": true, 00:13:42.172 "data_offset": 2048, 00:13:42.172 "data_size": 63488 00:13:42.173 }, 00:13:42.173 { 00:13:42.173 "name": "BaseBdev2", 00:13:42.173 "uuid": "3083d162-2a4f-5343-872b-a885d0cb592f", 00:13:42.173 "is_configured": true, 00:13:42.173 "data_offset": 2048, 00:13:42.173 "data_size": 63488 00:13:42.173 } 00:13:42.173 ] 00:13:42.173 }' 00:13:42.173 07:19:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.173 07:19:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.741 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:43.000 [2024-07-25 07:19:15.331719] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:43.000 [2024-07-25 07:19:15.331754] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:43.000 [2024-07-25 07:19:15.334700] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:43.000 [2024-07-25 07:19:15.334728] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:43.000 [2024-07-25 07:19:15.334751] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:13:43.000 [2024-07-25 07:19:15.334761] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24da860 name raid_bdev1, state offline 00:13:43.000 0 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1600924 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1600924 ']' 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1600924 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1600924 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1600924' 00:13:43.000 killing process with pid 1600924 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1600924 00:13:43.000 [2024-07-25 07:19:15.404269] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:43.000 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1600924 00:13:43.000 [2024-07-25 07:19:15.414061] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.JQ5ycSDBGo 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:13:43.259 00:13:43.259 real 0m5.888s 00:13:43.259 user 0m9.120s 00:13:43.259 sys 0m1.039s 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:43.259 07:19:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.259 ************************************ 00:13:43.259 END TEST raid_read_error_test 00:13:43.259 ************************************ 00:13:43.259 07:19:15 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:13:43.259 07:19:15 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:43.259 07:19:15 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:43.259 07:19:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:43.259 ************************************ 00:13:43.259 START TEST raid_write_error_test 00:13:43.259 ************************************ 00:13:43.259 07:19:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 write 
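Before the raid_write_error_test output continues, the check that raid_read_error_test just completed (and that the write variant repeats with write I/O) can be sketched as follows. The sequence is reconstructed from the log above; running perform_tests in the background and assembling the grep/awk pipeline into one assignment are assumptions, since bdev_raid.sh itself is not reproduced here.

# bdevperf is already attached to raid_bdev1; start the timed randrw run via RPC,
# then inject failures into one base bdev while I/O is in flight.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
    -s /var/tmp/spdk-raid.sock perform_tests &
sleep 1
$RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure   # the write test injects write failures instead
# After the run, parse bdevperf's output for raid_bdev1's failures-per-second column;
# concat has no redundancy, so the injected errors must surface as a non-zero failure rate.
fail_per_s=$(grep -v Job /raidtest/tmp.JQ5ycSDBGo | grep raid_bdev1 | awk '{print $6}')
[[ $fail_per_s != "0.00" ]]

In this run the parsed value was 0.47 failures per second, which satisfies the check and lets the test report success before tearing down raid_bdev1.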
00:13:43.259 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:13:43.259 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:13:43.259 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:13:43.259 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:13:43.259 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.g3h1iGetxh 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1602073 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1602073 /var/tmp/spdk-raid.sock 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1602073 ']' 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:43.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
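While this second bdevperf instance starts up below, the launch pattern (the same for both error tests) is worth spelling out. The sketch is based on the command line recorded in this log; the output redirection into the mktemp'd bdevperf_log and the pid capture are assumptions, and waitforlisten is the autotest_common.sh helper that polls the RPC socket until it accepts connections.

# bdevperf runs as a long-lived app on a private RPC socket (-z: wait for a perform_tests RPC
# before starting I/O), targeting raid_bdev1 with 60 s of randrw, 50% reads, 128 KiB I/O,
# queue depth 1, and bdev_raid debug logging enabled.
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
    -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f \
    -L bdev_raid > /raidtest/tmp.g3h1iGetxh 2>&1 &        # redirection to bdevperf_log assumed
raid_pid=$!
waitforlisten $raid_pid /var/tmp/spdk-raid.sock           # block until the RPC socket is ready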
00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:43.260 07:19:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.260 [2024-07-25 07:19:15.783700] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:13:43.260 [2024-07-25 07:19:15.783759] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1602073 ] 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested 
device 0000:3f:01.4 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:43.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:43.519 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:43.519 [2024-07-25 07:19:15.916966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:43.519 [2024-07-25 07:19:15.998923] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:43.778 [2024-07-25 07:19:16.057388] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:43.778 [2024-07-25 07:19:16.057422] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.347 07:19:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:44.347 07:19:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:13:44.347 07:19:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:44.347 07:19:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:44.347 BaseBdev1_malloc 00:13:44.347 07:19:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:44.605 true 00:13:44.605 07:19:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:44.865 [2024-07-25 07:19:17.290496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:44.865 [2024-07-25 07:19:17.290540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:44.865 [2024-07-25 07:19:17.290558] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8caa50 00:13:44.865 [2024-07-25 07:19:17.290569] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
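Each base device in this test is a small stack: a malloc bdev, an error-injection bdev on top of it (exposed as EE_<malloc>), and a passthru bdev that the raid actually claims, as the trace shows for BaseBdev1 above and repeats for BaseBdev2 below. A rough equivalent of that per-bdev loop, using only rpc.py calls and arguments visible in the trace (the loop itself and the RPC shorthand variable are illustrative), is:

# Illustrative sketch of the base-bdev stacking used by raid_write_error_test.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for bdev in BaseBdev1 BaseBdev2; do
  $RPC bdev_malloc_create 32 512 -b "${bdev}_malloc"            # 32 MiB backing store, 512 B blocks
  $RPC bdev_error_create "${bdev}_malloc"                       # exposes EE_${bdev}_malloc for fault injection
  $RPC bdev_passthru_create -b "EE_${bdev}_malloc" -p "${bdev}" # the name the raid will claim
done
# Assemble the array on top (concat, strip size 64 KiB, -s = with superblock).
$RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s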
00:13:44.865 [2024-07-25 07:19:17.292021] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:44.865 [2024-07-25 07:19:17.292049] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:44.865 BaseBdev1 00:13:44.865 07:19:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:44.865 07:19:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:45.124 BaseBdev2_malloc 00:13:45.124 07:19:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:45.382 true 00:13:45.382 07:19:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:45.641 [2024-07-25 07:19:17.980549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:45.641 [2024-07-25 07:19:17.980587] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:45.641 [2024-07-25 07:19:17.980604] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa73f40 00:13:45.641 [2024-07-25 07:19:17.980616] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:45.641 [2024-07-25 07:19:17.981886] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:45.641 [2024-07-25 07:19:17.981911] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:45.641 BaseBdev2 00:13:45.641 07:19:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:45.900 [2024-07-25 07:19:18.209196] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:45.900 [2024-07-25 07:19:18.210311] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:45.900 [2024-07-25 07:19:18.210488] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa76860 00:13:45.900 [2024-07-25 07:19:18.210501] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:45.900 [2024-07-25 07:19:18.210661] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa76280 00:13:45.900 [2024-07-25 07:19:18.210791] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa76860 00:13:45.900 [2024-07-25 07:19:18.210800] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa76860 00:13:45.900 [2024-07-25 07:19:18.210892] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # 
local raid_level=concat 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.900 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:46.201 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.201 "name": "raid_bdev1", 00:13:46.201 "uuid": "4f988e36-1efb-4306-80d8-8f74cf934efd", 00:13:46.201 "strip_size_kb": 64, 00:13:46.201 "state": "online", 00:13:46.201 "raid_level": "concat", 00:13:46.201 "superblock": true, 00:13:46.201 "num_base_bdevs": 2, 00:13:46.201 "num_base_bdevs_discovered": 2, 00:13:46.201 "num_base_bdevs_operational": 2, 00:13:46.201 "base_bdevs_list": [ 00:13:46.201 { 00:13:46.201 "name": "BaseBdev1", 00:13:46.201 "uuid": "3f8d0f09-5521-5ca2-8b66-8e07c256cbe8", 00:13:46.201 "is_configured": true, 00:13:46.201 "data_offset": 2048, 00:13:46.201 "data_size": 63488 00:13:46.201 }, 00:13:46.201 { 00:13:46.201 "name": "BaseBdev2", 00:13:46.201 "uuid": "f0b562b2-e151-5c62-b3c2-900e01e19007", 00:13:46.201 "is_configured": true, 00:13:46.201 "data_offset": 2048, 00:13:46.201 "data_size": 63488 00:13:46.201 } 00:13:46.201 ] 00:13:46.201 }' 00:13:46.201 07:19:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.201 07:19:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.770 07:19:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:13:46.770 07:19:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:46.770 [2024-07-25 07:19:19.139873] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa76280 00:13:47.709 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.968 07:19:20 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.968 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:48.228 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.228 "name": "raid_bdev1", 00:13:48.228 "uuid": "4f988e36-1efb-4306-80d8-8f74cf934efd", 00:13:48.228 "strip_size_kb": 64, 00:13:48.228 "state": "online", 00:13:48.228 "raid_level": "concat", 00:13:48.228 "superblock": true, 00:13:48.228 "num_base_bdevs": 2, 00:13:48.228 "num_base_bdevs_discovered": 2, 00:13:48.228 "num_base_bdevs_operational": 2, 00:13:48.228 "base_bdevs_list": [ 00:13:48.228 { 00:13:48.228 "name": "BaseBdev1", 00:13:48.228 "uuid": "3f8d0f09-5521-5ca2-8b66-8e07c256cbe8", 00:13:48.228 "is_configured": true, 00:13:48.228 "data_offset": 2048, 00:13:48.228 "data_size": 63488 00:13:48.228 }, 00:13:48.228 { 00:13:48.228 "name": "BaseBdev2", 00:13:48.228 "uuid": "f0b562b2-e151-5c62-b3c2-900e01e19007", 00:13:48.228 "is_configured": true, 00:13:48.228 "data_offset": 2048, 00:13:48.228 "data_size": 63488 00:13:48.228 } 00:13:48.228 ] 00:13:48.228 }' 00:13:48.228 07:19:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.228 07:19:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.798 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:48.798 [2024-07-25 07:19:21.282033] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:48.798 [2024-07-25 07:19:21.282075] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:48.798 [2024-07-25 07:19:21.285077] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:48.798 [2024-07-25 07:19:21.285107] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:48.798 [2024-07-25 07:19:21.285132] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:48.798 [2024-07-25 07:19:21.285148] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa76860 name raid_bdev1, state offline 00:13:48.798 0 00:13:48.798 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1602073 00:13:48.798 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1602073 ']' 00:13:48.798 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1602073 
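The two state checks above follow the same pattern: dump all raid bdevs over RPC, pick out raid_bdev1 with jq, and compare the interesting fields against the expected values. Stripped of the suite's helper plumbing, that check can be approximated as below; the RPC call and the jq select filter are exactly what the trace runs, while the individual field assertions are an assumed simplification of verify_raid_bdev_state.

# Approximate, stand-alone version of the raid state verification seen above.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($RPC bdev_raid_get_bdevs all | jq '.[] | select(.name == "raid_bdev1")')
test "$(jq -r .state                     <<<"$info")" = online
test "$(jq -r .raid_level                <<<"$info")" = concat
test "$(jq -r .strip_size_kb             <<<"$info")" -eq 64
test "$(jq -r .num_base_bdevs_discovered <<<"$info")" -eq 2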
00:13:48.798 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:13:48.798 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:48.798 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1602073 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1602073' 00:13:49.058 killing process with pid 1602073 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1602073 00:13:49.058 [2024-07-25 07:19:21.356928] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1602073 00:13:49.058 [2024-07-25 07:19:21.366534] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.g3h1iGetxh 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:13:49.058 00:13:49.058 real 0m5.871s 00:13:49.058 user 0m9.071s 00:13:49.058 sys 0m1.055s 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:49.058 07:19:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.058 ************************************ 00:13:49.058 END TEST raid_write_error_test 00:13:49.058 ************************************ 00:13:49.318 07:19:21 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:13:49.318 07:19:21 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:13:49.318 07:19:21 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:49.318 07:19:21 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:49.318 07:19:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:49.318 ************************************ 00:13:49.319 START TEST raid_state_function_test 00:13:49.319 ************************************ 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 false 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:49.319 07:19:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1603083 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1603083' 00:13:49.319 Process raid pid: 1603083 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1603083 /var/tmp/spdk-raid.sock 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1603083 ']' 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:49.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
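The state-function test that starts here drives a raid1 array through its configuring/online lifecycle by creating the raid before its base bdevs exist and only supplying them afterwards. Reduced to the bare RPC sequence exercised in the trace below (the inline jq state probes are an illustrative addition, and the delete/re-create iterations the test actually performs are omitted), the happy path looks like:

# Condensed sketch of the configuring -> online transition exercised below.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid              # base bdevs missing yet
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "configuring"
$RPC bdev_malloc_create 32 512 -b BaseBdev1
$RPC bdev_malloc_create 32 512 -b BaseBdev2                                          # second base completes the set
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "online"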
00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:49.319 07:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.319 [2024-07-25 07:19:21.734795] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:13:49.319 [2024-07-25 07:19:21.734852] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 
0000:3f:01.4 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:49.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:49.319 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:49.579 [2024-07-25 07:19:21.865638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:49.579 [2024-07-25 07:19:21.951429] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:49.579 [2024-07-25 07:19:22.004836] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:49.579 [2024-07-25 07:19:22.004861] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:50.148 07:19:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:50.148 07:19:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:13:50.148 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:50.407 [2024-07-25 07:19:22.842269] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:50.407 [2024-07-25 07:19:22.842307] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:50.407 [2024-07-25 07:19:22.842317] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:50.407 [2024-07-25 07:19:22.842328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.407 07:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.667 07:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.667 "name": "Existed_Raid", 00:13:50.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.667 "strip_size_kb": 0, 00:13:50.667 "state": "configuring", 00:13:50.667 "raid_level": "raid1", 00:13:50.667 "superblock": false, 00:13:50.667 "num_base_bdevs": 2, 00:13:50.667 "num_base_bdevs_discovered": 0, 00:13:50.667 "num_base_bdevs_operational": 2, 00:13:50.667 "base_bdevs_list": [ 00:13:50.667 { 00:13:50.667 "name": "BaseBdev1", 00:13:50.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.667 "is_configured": false, 00:13:50.667 "data_offset": 0, 00:13:50.667 "data_size": 0 00:13:50.667 }, 00:13:50.667 { 00:13:50.667 "name": "BaseBdev2", 00:13:50.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.667 "is_configured": false, 00:13:50.667 "data_offset": 0, 00:13:50.667 "data_size": 0 00:13:50.667 } 00:13:50.667 ] 00:13:50.667 }' 00:13:50.667 07:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.667 07:19:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.237 07:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:51.496 [2024-07-25 07:19:23.856815] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:51.496 [2024-07-25 07:19:23.856841] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16baea0 name Existed_Raid, state configuring 00:13:51.496 07:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:51.756 [2024-07-25 07:19:24.081415] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:51.756 [2024-07-25 07:19:24.081441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:51.756 [2024-07-25 07:19:24.081450] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:51.756 [2024-07-25 07:19:24.081461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:51.756 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 
32 512 -b BaseBdev1 00:13:52.016 [2024-07-25 07:19:24.315574] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:52.016 BaseBdev1 00:13:52.016 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:52.016 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:52.016 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:52.016 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:52.016 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:52.016 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:52.016 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:52.276 [ 00:13:52.276 { 00:13:52.276 "name": "BaseBdev1", 00:13:52.276 "aliases": [ 00:13:52.276 "ff06d8e6-224f-4e21-88dd-5608262f5845" 00:13:52.276 ], 00:13:52.276 "product_name": "Malloc disk", 00:13:52.276 "block_size": 512, 00:13:52.276 "num_blocks": 65536, 00:13:52.276 "uuid": "ff06d8e6-224f-4e21-88dd-5608262f5845", 00:13:52.276 "assigned_rate_limits": { 00:13:52.276 "rw_ios_per_sec": 0, 00:13:52.276 "rw_mbytes_per_sec": 0, 00:13:52.276 "r_mbytes_per_sec": 0, 00:13:52.276 "w_mbytes_per_sec": 0 00:13:52.276 }, 00:13:52.276 "claimed": true, 00:13:52.276 "claim_type": "exclusive_write", 00:13:52.276 "zoned": false, 00:13:52.276 "supported_io_types": { 00:13:52.276 "read": true, 00:13:52.276 "write": true, 00:13:52.276 "unmap": true, 00:13:52.276 "flush": true, 00:13:52.276 "reset": true, 00:13:52.276 "nvme_admin": false, 00:13:52.276 "nvme_io": false, 00:13:52.276 "nvme_io_md": false, 00:13:52.276 "write_zeroes": true, 00:13:52.276 "zcopy": true, 00:13:52.276 "get_zone_info": false, 00:13:52.276 "zone_management": false, 00:13:52.276 "zone_append": false, 00:13:52.276 "compare": false, 00:13:52.276 "compare_and_write": false, 00:13:52.276 "abort": true, 00:13:52.276 "seek_hole": false, 00:13:52.276 "seek_data": false, 00:13:52.276 "copy": true, 00:13:52.276 "nvme_iov_md": false 00:13:52.276 }, 00:13:52.276 "memory_domains": [ 00:13:52.276 { 00:13:52.276 "dma_device_id": "system", 00:13:52.276 "dma_device_type": 1 00:13:52.276 }, 00:13:52.276 { 00:13:52.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.276 "dma_device_type": 2 00:13:52.276 } 00:13:52.276 ], 00:13:52.276 "driver_specific": {} 00:13:52.276 } 00:13:52.276 ] 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:52.276 07:19:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.276 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.535 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.535 "name": "Existed_Raid", 00:13:52.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.535 "strip_size_kb": 0, 00:13:52.535 "state": "configuring", 00:13:52.535 "raid_level": "raid1", 00:13:52.535 "superblock": false, 00:13:52.535 "num_base_bdevs": 2, 00:13:52.535 "num_base_bdevs_discovered": 1, 00:13:52.535 "num_base_bdevs_operational": 2, 00:13:52.535 "base_bdevs_list": [ 00:13:52.535 { 00:13:52.535 "name": "BaseBdev1", 00:13:52.535 "uuid": "ff06d8e6-224f-4e21-88dd-5608262f5845", 00:13:52.535 "is_configured": true, 00:13:52.535 "data_offset": 0, 00:13:52.535 "data_size": 65536 00:13:52.535 }, 00:13:52.535 { 00:13:52.535 "name": "BaseBdev2", 00:13:52.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.536 "is_configured": false, 00:13:52.536 "data_offset": 0, 00:13:52.536 "data_size": 0 00:13:52.536 } 00:13:52.536 ] 00:13:52.536 }' 00:13:52.536 07:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.536 07:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:53.105 07:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:53.364 [2024-07-25 07:19:25.771414] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:53.364 [2024-07-25 07:19:25.771451] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ba790 name Existed_Raid, state configuring 00:13:53.364 07:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:53.624 [2024-07-25 07:19:25.996027] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:53.624 [2024-07-25 07:19:25.997457] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:53.624 [2024-07-25 07:19:25.997490] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 
-- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.624 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.884 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.884 "name": "Existed_Raid", 00:13:53.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.884 "strip_size_kb": 0, 00:13:53.884 "state": "configuring", 00:13:53.884 "raid_level": "raid1", 00:13:53.884 "superblock": false, 00:13:53.884 "num_base_bdevs": 2, 00:13:53.884 "num_base_bdevs_discovered": 1, 00:13:53.884 "num_base_bdevs_operational": 2, 00:13:53.884 "base_bdevs_list": [ 00:13:53.884 { 00:13:53.884 "name": "BaseBdev1", 00:13:53.884 "uuid": "ff06d8e6-224f-4e21-88dd-5608262f5845", 00:13:53.884 "is_configured": true, 00:13:53.884 "data_offset": 0, 00:13:53.884 "data_size": 65536 00:13:53.884 }, 00:13:53.884 { 00:13:53.884 "name": "BaseBdev2", 00:13:53.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.884 "is_configured": false, 00:13:53.884 "data_offset": 0, 00:13:53.884 "data_size": 0 00:13:53.884 } 00:13:53.884 ] 00:13:53.884 }' 00:13:53.884 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.884 07:19:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.453 07:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:54.713 [2024-07-25 07:19:27.037936] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:54.713 [2024-07-25 07:19:27.037973] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16bb580 00:13:54.713 [2024-07-25 07:19:27.037980] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:54.713 [2024-07-25 07:19:27.038167] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16b1e20 00:13:54.713 [2024-07-25 07:19:27.038288] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16bb580 00:13:54.713 [2024-07-25 07:19:27.038297] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x16bb580 00:13:54.713 [2024-07-25 07:19:27.038453] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:54.713 BaseBdev2 00:13:54.713 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:54.713 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:54.713 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:54.713 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:54.713 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:54.713 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:54.713 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.973 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:54.973 [ 00:13:54.973 { 00:13:54.973 "name": "BaseBdev2", 00:13:54.973 "aliases": [ 00:13:54.973 "9fccb7d8-9864-475a-9998-8ed64a29e73f" 00:13:54.973 ], 00:13:54.973 "product_name": "Malloc disk", 00:13:54.973 "block_size": 512, 00:13:54.973 "num_blocks": 65536, 00:13:54.973 "uuid": "9fccb7d8-9864-475a-9998-8ed64a29e73f", 00:13:54.973 "assigned_rate_limits": { 00:13:54.973 "rw_ios_per_sec": 0, 00:13:54.973 "rw_mbytes_per_sec": 0, 00:13:54.973 "r_mbytes_per_sec": 0, 00:13:54.973 "w_mbytes_per_sec": 0 00:13:54.973 }, 00:13:54.973 "claimed": true, 00:13:54.973 "claim_type": "exclusive_write", 00:13:54.973 "zoned": false, 00:13:54.973 "supported_io_types": { 00:13:54.973 "read": true, 00:13:54.973 "write": true, 00:13:54.973 "unmap": true, 00:13:54.973 "flush": true, 00:13:54.973 "reset": true, 00:13:54.973 "nvme_admin": false, 00:13:54.973 "nvme_io": false, 00:13:54.973 "nvme_io_md": false, 00:13:54.973 "write_zeroes": true, 00:13:54.973 "zcopy": true, 00:13:54.973 "get_zone_info": false, 00:13:54.973 "zone_management": false, 00:13:54.973 "zone_append": false, 00:13:54.973 "compare": false, 00:13:54.973 "compare_and_write": false, 00:13:54.973 "abort": true, 00:13:54.973 "seek_hole": false, 00:13:54.973 "seek_data": false, 00:13:54.973 "copy": true, 00:13:54.973 "nvme_iov_md": false 00:13:54.973 }, 00:13:54.973 "memory_domains": [ 00:13:54.973 { 00:13:54.973 "dma_device_id": "system", 00:13:54.973 "dma_device_type": 1 00:13:54.973 }, 00:13:54.973 { 00:13:54.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.973 "dma_device_type": 2 00:13:54.973 } 00:13:54.973 ], 00:13:54.973 "driver_specific": {} 00:13:54.973 } 00:13:54.973 ] 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.233 07:19:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.233 "name": "Existed_Raid", 00:13:55.233 "uuid": "8ca76b5b-6412-41d8-9d65-24fe1fa6d33e", 00:13:55.233 "strip_size_kb": 0, 00:13:55.233 "state": "online", 00:13:55.233 "raid_level": "raid1", 00:13:55.233 "superblock": false, 00:13:55.233 "num_base_bdevs": 2, 00:13:55.233 "num_base_bdevs_discovered": 2, 00:13:55.233 "num_base_bdevs_operational": 2, 00:13:55.233 "base_bdevs_list": [ 00:13:55.233 { 00:13:55.233 "name": "BaseBdev1", 00:13:55.233 "uuid": "ff06d8e6-224f-4e21-88dd-5608262f5845", 00:13:55.233 "is_configured": true, 00:13:55.233 "data_offset": 0, 00:13:55.233 "data_size": 65536 00:13:55.233 }, 00:13:55.233 { 00:13:55.233 "name": "BaseBdev2", 00:13:55.233 "uuid": "9fccb7d8-9864-475a-9998-8ed64a29e73f", 00:13:55.233 "is_configured": true, 00:13:55.233 "data_offset": 0, 00:13:55.233 "data_size": 65536 00:13:55.233 } 00:13:55.233 ] 00:13:55.233 }' 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.233 07:19:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:55.801 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:56.061 [2024-07-25 07:19:28.530124] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:56.061 07:19:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:56.061 "name": "Existed_Raid", 00:13:56.061 "aliases": [ 00:13:56.061 "8ca76b5b-6412-41d8-9d65-24fe1fa6d33e" 00:13:56.061 ], 00:13:56.061 "product_name": "Raid Volume", 00:13:56.061 "block_size": 512, 00:13:56.061 "num_blocks": 65536, 00:13:56.061 "uuid": "8ca76b5b-6412-41d8-9d65-24fe1fa6d33e", 00:13:56.061 "assigned_rate_limits": { 00:13:56.061 "rw_ios_per_sec": 0, 00:13:56.061 "rw_mbytes_per_sec": 0, 00:13:56.061 "r_mbytes_per_sec": 0, 00:13:56.061 "w_mbytes_per_sec": 0 00:13:56.061 }, 00:13:56.061 "claimed": false, 00:13:56.061 "zoned": false, 00:13:56.061 "supported_io_types": { 00:13:56.061 "read": true, 00:13:56.061 "write": true, 00:13:56.061 "unmap": false, 00:13:56.061 "flush": false, 00:13:56.061 "reset": true, 00:13:56.061 "nvme_admin": false, 00:13:56.061 "nvme_io": false, 00:13:56.061 "nvme_io_md": false, 00:13:56.061 "write_zeroes": true, 00:13:56.061 "zcopy": false, 00:13:56.061 "get_zone_info": false, 00:13:56.061 "zone_management": false, 00:13:56.061 "zone_append": false, 00:13:56.061 "compare": false, 00:13:56.061 "compare_and_write": false, 00:13:56.061 "abort": false, 00:13:56.061 "seek_hole": false, 00:13:56.061 "seek_data": false, 00:13:56.061 "copy": false, 00:13:56.061 "nvme_iov_md": false 00:13:56.061 }, 00:13:56.061 "memory_domains": [ 00:13:56.061 { 00:13:56.061 "dma_device_id": "system", 00:13:56.061 "dma_device_type": 1 00:13:56.061 }, 00:13:56.061 { 00:13:56.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.061 "dma_device_type": 2 00:13:56.061 }, 00:13:56.061 { 00:13:56.061 "dma_device_id": "system", 00:13:56.061 "dma_device_type": 1 00:13:56.061 }, 00:13:56.061 { 00:13:56.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.061 "dma_device_type": 2 00:13:56.061 } 00:13:56.061 ], 00:13:56.061 "driver_specific": { 00:13:56.061 "raid": { 00:13:56.061 "uuid": "8ca76b5b-6412-41d8-9d65-24fe1fa6d33e", 00:13:56.061 "strip_size_kb": 0, 00:13:56.061 "state": "online", 00:13:56.061 "raid_level": "raid1", 00:13:56.061 "superblock": false, 00:13:56.061 "num_base_bdevs": 2, 00:13:56.061 "num_base_bdevs_discovered": 2, 00:13:56.061 "num_base_bdevs_operational": 2, 00:13:56.061 "base_bdevs_list": [ 00:13:56.061 { 00:13:56.061 "name": "BaseBdev1", 00:13:56.061 "uuid": "ff06d8e6-224f-4e21-88dd-5608262f5845", 00:13:56.061 "is_configured": true, 00:13:56.061 "data_offset": 0, 00:13:56.061 "data_size": 65536 00:13:56.061 }, 00:13:56.061 { 00:13:56.061 "name": "BaseBdev2", 00:13:56.061 "uuid": "9fccb7d8-9864-475a-9998-8ed64a29e73f", 00:13:56.061 "is_configured": true, 00:13:56.061 "data_offset": 0, 00:13:56.061 "data_size": 65536 00:13:56.061 } 00:13:56.061 ] 00:13:56.061 } 00:13:56.061 } 00:13:56.061 }' 00:13:56.061 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:56.321 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:56.321 BaseBdev2' 00:13:56.321 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:56.321 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:56.321 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:56.321 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:13:56.321 "name": "BaseBdev1", 00:13:56.321 "aliases": [ 00:13:56.321 "ff06d8e6-224f-4e21-88dd-5608262f5845" 00:13:56.321 ], 00:13:56.321 "product_name": "Malloc disk", 00:13:56.321 "block_size": 512, 00:13:56.321 "num_blocks": 65536, 00:13:56.321 "uuid": "ff06d8e6-224f-4e21-88dd-5608262f5845", 00:13:56.321 "assigned_rate_limits": { 00:13:56.321 "rw_ios_per_sec": 0, 00:13:56.321 "rw_mbytes_per_sec": 0, 00:13:56.321 "r_mbytes_per_sec": 0, 00:13:56.321 "w_mbytes_per_sec": 0 00:13:56.321 }, 00:13:56.321 "claimed": true, 00:13:56.321 "claim_type": "exclusive_write", 00:13:56.321 "zoned": false, 00:13:56.321 "supported_io_types": { 00:13:56.321 "read": true, 00:13:56.321 "write": true, 00:13:56.321 "unmap": true, 00:13:56.321 "flush": true, 00:13:56.321 "reset": true, 00:13:56.321 "nvme_admin": false, 00:13:56.321 "nvme_io": false, 00:13:56.321 "nvme_io_md": false, 00:13:56.321 "write_zeroes": true, 00:13:56.321 "zcopy": true, 00:13:56.321 "get_zone_info": false, 00:13:56.321 "zone_management": false, 00:13:56.321 "zone_append": false, 00:13:56.321 "compare": false, 00:13:56.321 "compare_and_write": false, 00:13:56.321 "abort": true, 00:13:56.321 "seek_hole": false, 00:13:56.321 "seek_data": false, 00:13:56.321 "copy": true, 00:13:56.321 "nvme_iov_md": false 00:13:56.321 }, 00:13:56.321 "memory_domains": [ 00:13:56.321 { 00:13:56.321 "dma_device_id": "system", 00:13:56.321 "dma_device_type": 1 00:13:56.321 }, 00:13:56.321 { 00:13:56.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.321 "dma_device_type": 2 00:13:56.321 } 00:13:56.321 ], 00:13:56.321 "driver_specific": {} 00:13:56.321 }' 00:13:56.321 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:56.581 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:56.581 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:56.581 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:56.581 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:56.581 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:56.581 07:19:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:56.581 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:56.581 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:56.581 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:56.840 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:56.840 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:56.840 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:56.840 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:56.840 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:57.099 "name": "BaseBdev2", 00:13:57.099 "aliases": [ 00:13:57.099 "9fccb7d8-9864-475a-9998-8ed64a29e73f" 00:13:57.099 ], 00:13:57.099 "product_name": "Malloc disk", 
00:13:57.099 "block_size": 512, 00:13:57.099 "num_blocks": 65536, 00:13:57.099 "uuid": "9fccb7d8-9864-475a-9998-8ed64a29e73f", 00:13:57.099 "assigned_rate_limits": { 00:13:57.099 "rw_ios_per_sec": 0, 00:13:57.099 "rw_mbytes_per_sec": 0, 00:13:57.099 "r_mbytes_per_sec": 0, 00:13:57.099 "w_mbytes_per_sec": 0 00:13:57.099 }, 00:13:57.099 "claimed": true, 00:13:57.099 "claim_type": "exclusive_write", 00:13:57.099 "zoned": false, 00:13:57.099 "supported_io_types": { 00:13:57.099 "read": true, 00:13:57.099 "write": true, 00:13:57.099 "unmap": true, 00:13:57.099 "flush": true, 00:13:57.099 "reset": true, 00:13:57.099 "nvme_admin": false, 00:13:57.099 "nvme_io": false, 00:13:57.099 "nvme_io_md": false, 00:13:57.099 "write_zeroes": true, 00:13:57.099 "zcopy": true, 00:13:57.099 "get_zone_info": false, 00:13:57.099 "zone_management": false, 00:13:57.099 "zone_append": false, 00:13:57.099 "compare": false, 00:13:57.099 "compare_and_write": false, 00:13:57.099 "abort": true, 00:13:57.099 "seek_hole": false, 00:13:57.099 "seek_data": false, 00:13:57.099 "copy": true, 00:13:57.099 "nvme_iov_md": false 00:13:57.099 }, 00:13:57.099 "memory_domains": [ 00:13:57.099 { 00:13:57.099 "dma_device_id": "system", 00:13:57.099 "dma_device_type": 1 00:13:57.099 }, 00:13:57.099 { 00:13:57.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.099 "dma_device_type": 2 00:13:57.099 } 00:13:57.099 ], 00:13:57.099 "driver_specific": {} 00:13:57.099 }' 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:57.099 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:57.358 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:57.358 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:57.358 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:57.358 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:57.358 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:57.617 [2024-07-25 07:19:29.953658] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:57.617 07:19:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.617 07:19:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.876 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.876 "name": "Existed_Raid", 00:13:57.876 "uuid": "8ca76b5b-6412-41d8-9d65-24fe1fa6d33e", 00:13:57.876 "strip_size_kb": 0, 00:13:57.876 "state": "online", 00:13:57.876 "raid_level": "raid1", 00:13:57.876 "superblock": false, 00:13:57.876 "num_base_bdevs": 2, 00:13:57.876 "num_base_bdevs_discovered": 1, 00:13:57.876 "num_base_bdevs_operational": 1, 00:13:57.876 "base_bdevs_list": [ 00:13:57.876 { 00:13:57.876 "name": null, 00:13:57.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.876 "is_configured": false, 00:13:57.876 "data_offset": 0, 00:13:57.876 "data_size": 65536 00:13:57.876 }, 00:13:57.876 { 00:13:57.876 "name": "BaseBdev2", 00:13:57.876 "uuid": "9fccb7d8-9864-475a-9998-8ed64a29e73f", 00:13:57.876 "is_configured": true, 00:13:57.876 "data_offset": 0, 00:13:57.876 "data_size": 65536 00:13:57.876 } 00:13:57.876 ] 00:13:57.876 }' 00:13:57.876 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.876 07:19:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.444 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:58.444 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:58.444 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.444 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:58.703 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:58.703 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:58.703 07:19:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:58.703 [2024-07-25 07:19:31.210022] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:58.703 [2024-07-25 07:19:31.210103] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:58.703 [2024-07-25 07:19:31.220557] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:58.703 [2024-07-25 07:19:31.220590] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:58.703 [2024-07-25 07:19:31.220600] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16bb580 name Existed_Raid, state offline 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1603083 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1603083 ']' 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1603083 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:58.962 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1603083 00:13:59.221 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:59.221 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:59.221 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1603083' 00:13:59.221 killing process with pid 1603083 00:13:59.221 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1603083 00:13:59.222 [2024-07-25 07:19:31.524560] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:59.222 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1603083 00:13:59.222 [2024-07-25 07:19:31.525417] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:59.222 07:19:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:59.222 00:13:59.222 real 0m10.047s 00:13:59.222 user 0m17.861s 00:13:59.222 sys 0m1.877s 00:13:59.222 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:59.222 07:19:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.222 ************************************ 00:13:59.222 END TEST 
raid_state_function_test 00:13:59.222 ************************************ 00:13:59.481 07:19:31 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:13:59.481 07:19:31 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:59.481 07:19:31 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:59.482 07:19:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:59.482 ************************************ 00:13:59.482 START TEST raid_state_function_test_sb 00:13:59.482 ************************************ 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1605051 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1605051' 00:13:59.482 Process raid pid: 1605051 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1605051 /var/tmp/spdk-raid.sock 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1605051 ']' 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:59.482 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:59.482 07:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.482 [2024-07-25 07:19:31.856377] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:13:59.482 [2024-07-25 07:19:31.856433] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:59.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:59.482 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:59.482 EAL: Requested devices 0000:3d:02.6, 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7 cannot be used 00:13:59.483 [2024-07-25 07:19:31.988577] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:59.742 [2024-07-25 07:19:32.075066] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:13:59.742 [2024-07-25 07:19:32.132249] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:59.742 [2024-07-25 07:19:32.132278] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:00.310 07:19:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:00.310 07:19:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:14:00.310 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:00.597 [2024-07-25 07:19:32.891022] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name:
BaseBdev1 00:14:00.597 [2024-07-25 07:19:32.891060] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:00.597 [2024-07-25 07:19:32.891074] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:00.597 [2024-07-25 07:19:32.891085] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.597 07:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.856 07:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.856 "name": "Existed_Raid", 00:14:00.856 "uuid": "1642bfd3-54c4-4b70-ae51-1fda8695091f", 00:14:00.856 "strip_size_kb": 0, 00:14:00.856 "state": "configuring", 00:14:00.856 "raid_level": "raid1", 00:14:00.856 "superblock": true, 00:14:00.857 "num_base_bdevs": 2, 00:14:00.857 "num_base_bdevs_discovered": 0, 00:14:00.857 "num_base_bdevs_operational": 2, 00:14:00.857 "base_bdevs_list": [ 00:14:00.857 { 00:14:00.857 "name": "BaseBdev1", 00:14:00.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.857 "is_configured": false, 00:14:00.857 "data_offset": 0, 00:14:00.857 "data_size": 0 00:14:00.857 }, 00:14:00.857 { 00:14:00.857 "name": "BaseBdev2", 00:14:00.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.857 "is_configured": false, 00:14:00.857 "data_offset": 0, 00:14:00.857 "data_size": 0 00:14:00.857 } 00:14:00.857 ] 00:14:00.857 }' 00:14:00.857 07:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.857 07:19:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.424 07:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:01.424 [2024-07-25 07:19:33.913587] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:01.424 [2024-07-25 07:19:33.913613] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x1594ea0 name Existed_Raid, state configuring 00:14:01.424 07:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:01.683 [2024-07-25 07:19:34.142309] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:01.683 [2024-07-25 07:19:34.142334] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:01.683 [2024-07-25 07:19:34.142343] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:01.683 [2024-07-25 07:19:34.142353] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:01.683 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:01.942 [2024-07-25 07:19:34.384435] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:01.942 BaseBdev1 00:14:01.942 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:01.942 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:01.942 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:01.942 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:01.942 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:01.942 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:01.942 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:02.201 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:02.460 [ 00:14:02.460 { 00:14:02.460 "name": "BaseBdev1", 00:14:02.460 "aliases": [ 00:14:02.460 "bebbb1a9-2099-4d32-9289-456f9e7fe699" 00:14:02.460 ], 00:14:02.460 "product_name": "Malloc disk", 00:14:02.460 "block_size": 512, 00:14:02.460 "num_blocks": 65536, 00:14:02.460 "uuid": "bebbb1a9-2099-4d32-9289-456f9e7fe699", 00:14:02.460 "assigned_rate_limits": { 00:14:02.460 "rw_ios_per_sec": 0, 00:14:02.460 "rw_mbytes_per_sec": 0, 00:14:02.460 "r_mbytes_per_sec": 0, 00:14:02.460 "w_mbytes_per_sec": 0 00:14:02.460 }, 00:14:02.460 "claimed": true, 00:14:02.460 "claim_type": "exclusive_write", 00:14:02.460 "zoned": false, 00:14:02.460 "supported_io_types": { 00:14:02.460 "read": true, 00:14:02.460 "write": true, 00:14:02.461 "unmap": true, 00:14:02.461 "flush": true, 00:14:02.461 "reset": true, 00:14:02.461 "nvme_admin": false, 00:14:02.461 "nvme_io": false, 00:14:02.461 "nvme_io_md": false, 00:14:02.461 "write_zeroes": true, 00:14:02.461 "zcopy": true, 00:14:02.461 "get_zone_info": false, 00:14:02.461 "zone_management": false, 00:14:02.461 "zone_append": false, 00:14:02.461 "compare": false, 00:14:02.461 "compare_and_write": false, 00:14:02.461 "abort": true, 00:14:02.461 "seek_hole": false, 00:14:02.461 "seek_data": false, 
00:14:02.461 "copy": true, 00:14:02.461 "nvme_iov_md": false 00:14:02.461 }, 00:14:02.461 "memory_domains": [ 00:14:02.461 { 00:14:02.461 "dma_device_id": "system", 00:14:02.461 "dma_device_type": 1 00:14:02.461 }, 00:14:02.461 { 00:14:02.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.461 "dma_device_type": 2 00:14:02.461 } 00:14:02.461 ], 00:14:02.461 "driver_specific": {} 00:14:02.461 } 00:14:02.461 ] 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.461 07:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.720 07:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.720 "name": "Existed_Raid", 00:14:02.720 "uuid": "e545534d-a70f-4e8b-8606-8d7e4678b689", 00:14:02.720 "strip_size_kb": 0, 00:14:02.720 "state": "configuring", 00:14:02.720 "raid_level": "raid1", 00:14:02.720 "superblock": true, 00:14:02.720 "num_base_bdevs": 2, 00:14:02.720 "num_base_bdevs_discovered": 1, 00:14:02.721 "num_base_bdevs_operational": 2, 00:14:02.721 "base_bdevs_list": [ 00:14:02.721 { 00:14:02.721 "name": "BaseBdev1", 00:14:02.721 "uuid": "bebbb1a9-2099-4d32-9289-456f9e7fe699", 00:14:02.721 "is_configured": true, 00:14:02.721 "data_offset": 2048, 00:14:02.721 "data_size": 63488 00:14:02.721 }, 00:14:02.721 { 00:14:02.721 "name": "BaseBdev2", 00:14:02.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.721 "is_configured": false, 00:14:02.721 "data_offset": 0, 00:14:02.721 "data_size": 0 00:14:02.721 } 00:14:02.721 ] 00:14:02.721 }' 00:14:02.721 07:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.721 07:19:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.290 07:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:03.549 [2024-07-25 07:19:35.872448] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:14:03.549 [2024-07-25 07:19:35.872481] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1594790 name Existed_Raid, state configuring 00:14:03.549 07:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:03.808 [2024-07-25 07:19:36.101080] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:03.808 [2024-07-25 07:19:36.102475] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:03.808 [2024-07-25 07:19:36.102507] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:03.808 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:03.809 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.809 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.809 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.809 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.809 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.809 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.068 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.068 "name": "Existed_Raid", 00:14:04.068 "uuid": "00b728b9-c7bd-44ac-b069-da737823c8ee", 00:14:04.068 "strip_size_kb": 0, 00:14:04.068 "state": "configuring", 00:14:04.068 "raid_level": "raid1", 00:14:04.068 "superblock": true, 00:14:04.068 "num_base_bdevs": 2, 00:14:04.068 "num_base_bdevs_discovered": 1, 00:14:04.068 "num_base_bdevs_operational": 2, 00:14:04.068 "base_bdevs_list": [ 00:14:04.068 { 00:14:04.068 "name": "BaseBdev1", 00:14:04.068 "uuid": "bebbb1a9-2099-4d32-9289-456f9e7fe699", 00:14:04.068 "is_configured": true, 00:14:04.068 "data_offset": 2048, 00:14:04.068 "data_size": 63488 00:14:04.068 }, 00:14:04.068 { 00:14:04.068 "name": "BaseBdev2", 00:14:04.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.068 "is_configured": false, 00:14:04.068 "data_offset": 0, 00:14:04.068 "data_size": 0 00:14:04.068 } 00:14:04.068 ] 00:14:04.068 }' 00:14:04.068 07:19:36 
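Here the superblock variant has deleted and re-created Existed_Raid with the -s flag; BaseBdev1, the 32 MB malloc bdev with 512-byte blocks created a few steps earlier, is claimed immediately, while BaseBdev2 does not exist yet, so the dump above shows state configuring with one of two base bdevs discovered, and the on-disk superblock is why BaseBdev1 reports data_offset 2048 and data_size 63488 instead of the full 65536 blocks. A rough sketch of that sequence, assuming the same rpc.py and socket paths used throughout this log:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Register the raid1 bdev first; the named base bdevs are allowed to show up later.
  $rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  # Create one leg; it gets claimed by the raid as soon as it appears.
  $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1
  # Until the second leg exists the array stays in the configuring state.
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'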
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.068 07:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:04.636 07:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:04.636 [2024-07-25 07:19:37.030561] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:04.636 [2024-07-25 07:19:37.030695] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1595580 00:14:04.636 [2024-07-25 07:19:37.030708] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:04.636 [2024-07-25 07:19:37.030865] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1596b20 00:14:04.636 [2024-07-25 07:19:37.030983] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1595580 00:14:04.636 [2024-07-25 07:19:37.030992] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1595580 00:14:04.636 [2024-07-25 07:19:37.031077] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:04.636 BaseBdev2 00:14:04.636 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:04.636 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:04.636 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:04.636 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:04.636 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:04.636 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:04.636 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.895 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:05.155 [ 00:14:05.155 { 00:14:05.155 "name": "BaseBdev2", 00:14:05.155 "aliases": [ 00:14:05.155 "a7b2ccd5-c288-446f-9923-985dd83dfb26" 00:14:05.155 ], 00:14:05.155 "product_name": "Malloc disk", 00:14:05.155 "block_size": 512, 00:14:05.155 "num_blocks": 65536, 00:14:05.155 "uuid": "a7b2ccd5-c288-446f-9923-985dd83dfb26", 00:14:05.155 "assigned_rate_limits": { 00:14:05.155 "rw_ios_per_sec": 0, 00:14:05.155 "rw_mbytes_per_sec": 0, 00:14:05.155 "r_mbytes_per_sec": 0, 00:14:05.155 "w_mbytes_per_sec": 0 00:14:05.155 }, 00:14:05.155 "claimed": true, 00:14:05.155 "claim_type": "exclusive_write", 00:14:05.155 "zoned": false, 00:14:05.155 "supported_io_types": { 00:14:05.155 "read": true, 00:14:05.155 "write": true, 00:14:05.155 "unmap": true, 00:14:05.155 "flush": true, 00:14:05.155 "reset": true, 00:14:05.155 "nvme_admin": false, 00:14:05.155 "nvme_io": false, 00:14:05.155 "nvme_io_md": false, 00:14:05.155 "write_zeroes": true, 00:14:05.155 "zcopy": true, 00:14:05.155 "get_zone_info": false, 00:14:05.155 "zone_management": false, 00:14:05.155 "zone_append": false, 00:14:05.155 "compare": false, 
00:14:05.155 "compare_and_write": false, 00:14:05.155 "abort": true, 00:14:05.155 "seek_hole": false, 00:14:05.155 "seek_data": false, 00:14:05.155 "copy": true, 00:14:05.155 "nvme_iov_md": false 00:14:05.155 }, 00:14:05.155 "memory_domains": [ 00:14:05.155 { 00:14:05.155 "dma_device_id": "system", 00:14:05.155 "dma_device_type": 1 00:14:05.155 }, 00:14:05.155 { 00:14:05.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.155 "dma_device_type": 2 00:14:05.155 } 00:14:05.155 ], 00:14:05.155 "driver_specific": {} 00:14:05.155 } 00:14:05.155 ] 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.155 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.414 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.414 "name": "Existed_Raid", 00:14:05.414 "uuid": "00b728b9-c7bd-44ac-b069-da737823c8ee", 00:14:05.414 "strip_size_kb": 0, 00:14:05.414 "state": "online", 00:14:05.414 "raid_level": "raid1", 00:14:05.414 "superblock": true, 00:14:05.414 "num_base_bdevs": 2, 00:14:05.414 "num_base_bdevs_discovered": 2, 00:14:05.414 "num_base_bdevs_operational": 2, 00:14:05.414 "base_bdevs_list": [ 00:14:05.414 { 00:14:05.414 "name": "BaseBdev1", 00:14:05.414 "uuid": "bebbb1a9-2099-4d32-9289-456f9e7fe699", 00:14:05.414 "is_configured": true, 00:14:05.414 "data_offset": 2048, 00:14:05.414 "data_size": 63488 00:14:05.414 }, 00:14:05.414 { 00:14:05.414 "name": "BaseBdev2", 00:14:05.414 "uuid": "a7b2ccd5-c288-446f-9923-985dd83dfb26", 00:14:05.414 "is_configured": true, 00:14:05.414 "data_offset": 2048, 00:14:05.414 "data_size": 63488 00:14:05.414 } 00:14:05.414 ] 00:14:05.414 }' 00:14:05.414 07:19:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.414 07:19:37 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:05.983 [2024-07-25 07:19:38.454534] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:05.983 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:05.983 "name": "Existed_Raid", 00:14:05.983 "aliases": [ 00:14:05.983 "00b728b9-c7bd-44ac-b069-da737823c8ee" 00:14:05.983 ], 00:14:05.983 "product_name": "Raid Volume", 00:14:05.983 "block_size": 512, 00:14:05.983 "num_blocks": 63488, 00:14:05.983 "uuid": "00b728b9-c7bd-44ac-b069-da737823c8ee", 00:14:05.983 "assigned_rate_limits": { 00:14:05.983 "rw_ios_per_sec": 0, 00:14:05.983 "rw_mbytes_per_sec": 0, 00:14:05.983 "r_mbytes_per_sec": 0, 00:14:05.983 "w_mbytes_per_sec": 0 00:14:05.983 }, 00:14:05.983 "claimed": false, 00:14:05.983 "zoned": false, 00:14:05.983 "supported_io_types": { 00:14:05.983 "read": true, 00:14:05.983 "write": true, 00:14:05.983 "unmap": false, 00:14:05.983 "flush": false, 00:14:05.983 "reset": true, 00:14:05.983 "nvme_admin": false, 00:14:05.983 "nvme_io": false, 00:14:05.983 "nvme_io_md": false, 00:14:05.983 "write_zeroes": true, 00:14:05.983 "zcopy": false, 00:14:05.983 "get_zone_info": false, 00:14:05.983 "zone_management": false, 00:14:05.983 "zone_append": false, 00:14:05.983 "compare": false, 00:14:05.983 "compare_and_write": false, 00:14:05.983 "abort": false, 00:14:05.983 "seek_hole": false, 00:14:05.983 "seek_data": false, 00:14:05.983 "copy": false, 00:14:05.983 "nvme_iov_md": false 00:14:05.983 }, 00:14:05.983 "memory_domains": [ 00:14:05.983 { 00:14:05.983 "dma_device_id": "system", 00:14:05.983 "dma_device_type": 1 00:14:05.983 }, 00:14:05.983 { 00:14:05.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.983 "dma_device_type": 2 00:14:05.983 }, 00:14:05.983 { 00:14:05.983 "dma_device_id": "system", 00:14:05.983 "dma_device_type": 1 00:14:05.983 }, 00:14:05.983 { 00:14:05.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.983 "dma_device_type": 2 00:14:05.983 } 00:14:05.983 ], 00:14:05.983 "driver_specific": { 00:14:05.983 "raid": { 00:14:05.983 "uuid": "00b728b9-c7bd-44ac-b069-da737823c8ee", 00:14:05.983 "strip_size_kb": 0, 00:14:05.983 "state": "online", 00:14:05.983 "raid_level": "raid1", 00:14:05.984 "superblock": true, 00:14:05.984 "num_base_bdevs": 2, 00:14:05.984 "num_base_bdevs_discovered": 2, 00:14:05.984 "num_base_bdevs_operational": 2, 00:14:05.984 "base_bdevs_list": [ 00:14:05.984 { 00:14:05.984 "name": "BaseBdev1", 00:14:05.984 "uuid": "bebbb1a9-2099-4d32-9289-456f9e7fe699", 
00:14:05.984 "is_configured": true, 00:14:05.984 "data_offset": 2048, 00:14:05.984 "data_size": 63488 00:14:05.984 }, 00:14:05.984 { 00:14:05.984 "name": "BaseBdev2", 00:14:05.984 "uuid": "a7b2ccd5-c288-446f-9923-985dd83dfb26", 00:14:05.984 "is_configured": true, 00:14:05.984 "data_offset": 2048, 00:14:05.984 "data_size": 63488 00:14:05.984 } 00:14:05.984 ] 00:14:05.984 } 00:14:05.984 } 00:14:05.984 }' 00:14:05.984 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:06.243 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:06.243 BaseBdev2' 00:14:06.243 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.243 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:06.243 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.243 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.243 "name": "BaseBdev1", 00:14:06.243 "aliases": [ 00:14:06.243 "bebbb1a9-2099-4d32-9289-456f9e7fe699" 00:14:06.243 ], 00:14:06.243 "product_name": "Malloc disk", 00:14:06.243 "block_size": 512, 00:14:06.243 "num_blocks": 65536, 00:14:06.243 "uuid": "bebbb1a9-2099-4d32-9289-456f9e7fe699", 00:14:06.243 "assigned_rate_limits": { 00:14:06.243 "rw_ios_per_sec": 0, 00:14:06.243 "rw_mbytes_per_sec": 0, 00:14:06.243 "r_mbytes_per_sec": 0, 00:14:06.243 "w_mbytes_per_sec": 0 00:14:06.243 }, 00:14:06.243 "claimed": true, 00:14:06.243 "claim_type": "exclusive_write", 00:14:06.243 "zoned": false, 00:14:06.243 "supported_io_types": { 00:14:06.243 "read": true, 00:14:06.243 "write": true, 00:14:06.243 "unmap": true, 00:14:06.243 "flush": true, 00:14:06.243 "reset": true, 00:14:06.243 "nvme_admin": false, 00:14:06.243 "nvme_io": false, 00:14:06.243 "nvme_io_md": false, 00:14:06.243 "write_zeroes": true, 00:14:06.243 "zcopy": true, 00:14:06.243 "get_zone_info": false, 00:14:06.243 "zone_management": false, 00:14:06.243 "zone_append": false, 00:14:06.243 "compare": false, 00:14:06.243 "compare_and_write": false, 00:14:06.243 "abort": true, 00:14:06.243 "seek_hole": false, 00:14:06.243 "seek_data": false, 00:14:06.243 "copy": true, 00:14:06.243 "nvme_iov_md": false 00:14:06.243 }, 00:14:06.243 "memory_domains": [ 00:14:06.243 { 00:14:06.243 "dma_device_id": "system", 00:14:06.243 "dma_device_type": 1 00:14:06.243 }, 00:14:06.243 { 00:14:06.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.243 "dma_device_type": 2 00:14:06.243 } 00:14:06.243 ], 00:14:06.243 "driver_specific": {} 00:14:06.243 }' 00:14:06.243 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.503 07:19:38 
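The verify_raid_bdev_properties pass running here (its md_interleave and dif_type checks continue just below) amounts to one bdev_get_bdevs dump per device plus a handful of jq comparisons against the raid volume's geometry. A condensed sketch of the same idea, assuming the rpc.py and socket paths from this log and folding the per-field checks into one loop:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Names of the base bdevs that are actually configured into the raid volume.
  names=$($rpc -s $sock bdev_get_bdevs -b Existed_Raid | jq -r '.[].driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
  for name in $names; do
      info=$($rpc -s $sock bdev_get_bdevs -b "$name" | jq '.[]')
      # Each leg is expected to match the raid bdev: 512-byte blocks, no separate metadata, no DIF.
      [ "$(echo "$info" | jq .block_size)" = 512 ]
      [ "$(echo "$info" | jq .md_size)" = null ]
      [ "$(echo "$info" | jq .md_interleave)" = null ]
      [ "$(echo "$info" | jq .dif_type)" = null ]
  done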
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.503 07:19:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.762 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.762 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.762 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.763 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:06.763 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.022 "name": "BaseBdev2", 00:14:07.022 "aliases": [ 00:14:07.022 "a7b2ccd5-c288-446f-9923-985dd83dfb26" 00:14:07.022 ], 00:14:07.022 "product_name": "Malloc disk", 00:14:07.022 "block_size": 512, 00:14:07.022 "num_blocks": 65536, 00:14:07.022 "uuid": "a7b2ccd5-c288-446f-9923-985dd83dfb26", 00:14:07.022 "assigned_rate_limits": { 00:14:07.022 "rw_ios_per_sec": 0, 00:14:07.022 "rw_mbytes_per_sec": 0, 00:14:07.022 "r_mbytes_per_sec": 0, 00:14:07.022 "w_mbytes_per_sec": 0 00:14:07.022 }, 00:14:07.022 "claimed": true, 00:14:07.022 "claim_type": "exclusive_write", 00:14:07.022 "zoned": false, 00:14:07.022 "supported_io_types": { 00:14:07.022 "read": true, 00:14:07.022 "write": true, 00:14:07.022 "unmap": true, 00:14:07.022 "flush": true, 00:14:07.022 "reset": true, 00:14:07.022 "nvme_admin": false, 00:14:07.022 "nvme_io": false, 00:14:07.022 "nvme_io_md": false, 00:14:07.022 "write_zeroes": true, 00:14:07.022 "zcopy": true, 00:14:07.022 "get_zone_info": false, 00:14:07.022 "zone_management": false, 00:14:07.022 "zone_append": false, 00:14:07.022 "compare": false, 00:14:07.022 "compare_and_write": false, 00:14:07.022 "abort": true, 00:14:07.022 "seek_hole": false, 00:14:07.022 "seek_data": false, 00:14:07.022 "copy": true, 00:14:07.022 "nvme_iov_md": false 00:14:07.022 }, 00:14:07.022 "memory_domains": [ 00:14:07.022 { 00:14:07.022 "dma_device_id": "system", 00:14:07.022 "dma_device_type": 1 00:14:07.022 }, 00:14:07.022 { 00:14:07.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.022 "dma_device_type": 2 00:14:07.022 } 00:14:07.022 ], 00:14:07.022 "driver_specific": {} 00:14:07.022 }' 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.022 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.022 07:19:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.281 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.281 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.281 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.281 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.281 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:07.541 [2024-07-25 07:19:39.858025] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.541 07:19:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.801 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.801 "name": "Existed_Raid", 00:14:07.801 "uuid": "00b728b9-c7bd-44ac-b069-da737823c8ee", 00:14:07.801 "strip_size_kb": 0, 00:14:07.801 "state": "online", 00:14:07.801 "raid_level": "raid1", 00:14:07.801 "superblock": true, 00:14:07.801 "num_base_bdevs": 2, 00:14:07.801 "num_base_bdevs_discovered": 1, 00:14:07.801 "num_base_bdevs_operational": 1, 00:14:07.801 "base_bdevs_list": [ 00:14:07.801 { 00:14:07.801 "name": null, 00:14:07.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.801 "is_configured": false, 00:14:07.801 "data_offset": 2048, 
00:14:07.801 "data_size": 63488 00:14:07.801 }, 00:14:07.801 { 00:14:07.801 "name": "BaseBdev2", 00:14:07.801 "uuid": "a7b2ccd5-c288-446f-9923-985dd83dfb26", 00:14:07.801 "is_configured": true, 00:14:07.801 "data_offset": 2048, 00:14:07.801 "data_size": 63488 00:14:07.801 } 00:14:07.801 ] 00:14:07.801 }' 00:14:07.801 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.801 07:19:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.369 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:08.370 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:08.370 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.370 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:08.629 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:08.629 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:08.629 07:19:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:08.629 [2024-07-25 07:19:41.058304] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:08.629 [2024-07-25 07:19:41.058378] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:08.629 [2024-07-25 07:19:41.068790] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:08.629 [2024-07-25 07:19:41.068821] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:08.629 [2024-07-25 07:19:41.068831] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1595580 name Existed_Raid, state offline 00:14:08.629 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:08.629 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:08.629 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.629 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:08.888 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1605051 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1605051 ']' 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1605051 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- 
# '[' Linux = Linux ']' 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1605051 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1605051' 00:14:08.889 killing process with pid 1605051 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1605051 00:14:08.889 [2024-07-25 07:19:41.294980] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:08.889 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1605051 00:14:08.889 [2024-07-25 07:19:41.295819] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:09.148 07:19:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:09.148 00:14:09.148 real 0m9.689s 00:14:09.148 user 0m17.230s 00:14:09.148 sys 0m1.816s 00:14:09.148 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:09.148 07:19:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:09.148 ************************************ 00:14:09.148 END TEST raid_state_function_test_sb 00:14:09.148 ************************************ 00:14:09.148 07:19:41 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:14:09.148 07:19:41 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:14:09.148 07:19:41 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:09.148 07:19:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:09.148 ************************************ 00:14:09.148 START TEST raid_superblock_test 00:14:09.148 ************************************ 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@417 -- # local raid_bdev 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1606872 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1606872 /var/tmp/spdk-raid.sock 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1606872 ']' 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:09.148 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:09.148 07:19:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.148 [2024-07-25 07:19:41.607565] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:14:09.149 [2024-07-25 07:19:41.607617] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606872 ] 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.149 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:09.149 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:09.408 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:09.408 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:09.408 [2024-07-25 07:19:41.740750] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:09.408 [2024-07-25 07:19:41.826009] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:14:09.408 [2024-07-25 07:19:41.887060] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:09.408 [2024-07-25 07:19:41.887098] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 
00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:09.977 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:10.236 malloc1 00:14:10.236 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:10.496 [2024-07-25 07:19:42.861061] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:10.496 [2024-07-25 07:19:42.861103] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:10.496 [2024-07-25 07:19:42.861122] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0b280 00:14:10.496 [2024-07-25 07:19:42.861134] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:10.496 [2024-07-25 07:19:42.862643] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:10.496 [2024-07-25 07:19:42.862671] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:10.496 pt1 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:10.496 07:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:10.496 malloc2 00:14:10.755 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:10.755 [2024-07-25 07:19:43.238636] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:10.755 [2024-07-25 07:19:43.238675] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:10.755 [2024-07-25 07:19:43.238694] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb68c0 00:14:10.755 [2024-07-25 07:19:43.238706] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:10.755 [2024-07-25 07:19:43.240046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:10.755 [2024-07-25 07:19:43.240072] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:10.755 pt2 00:14:10.755 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:10.755 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:10.755 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:14:11.013 [2024-07-25 07:19:43.399071] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:11.013 [2024-07-25 07:19:43.400225] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:11.013 [2024-07-25 07:19:43.400361] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfb4720 00:14:11.013 [2024-07-25 07:19:43.400373] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:11.013 [2024-07-25 07:19:43.400546] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0c6e0 00:14:11.013 [2024-07-25 07:19:43.400679] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfb4720 00:14:11.013 [2024-07-25 07:19:43.400689] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfb4720 00:14:11.013 [2024-07-25 07:19:43.400777] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.013 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:11.272 07:19:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.272 "name": "raid_bdev1", 00:14:11.272 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:11.272 "strip_size_kb": 0, 00:14:11.272 "state": "online", 00:14:11.272 "raid_level": "raid1", 00:14:11.272 "superblock": true, 00:14:11.272 "num_base_bdevs": 2, 00:14:11.272 "num_base_bdevs_discovered": 2, 00:14:11.272 "num_base_bdevs_operational": 2, 00:14:11.272 "base_bdevs_list": [ 00:14:11.272 { 00:14:11.272 "name": "pt1", 00:14:11.272 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.272 "is_configured": true, 00:14:11.272 "data_offset": 2048, 00:14:11.272 "data_size": 63488 00:14:11.272 }, 00:14:11.272 { 00:14:11.272 "name": "pt2", 00:14:11.272 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.272 "is_configured": true, 00:14:11.272 "data_offset": 2048, 00:14:11.272 "data_size": 63488 00:14:11.272 } 00:14:11.272 ] 00:14:11.272 }' 00:14:11.272 07:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.272 07:19:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:12.209 [2024-07-25 07:19:44.630517] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:12.209 "name": "raid_bdev1", 00:14:12.209 "aliases": [ 00:14:12.209 "238a161e-3b83-4bcd-be94-415d12e4aec7" 00:14:12.209 ], 00:14:12.209 "product_name": "Raid Volume", 00:14:12.209 "block_size": 512, 00:14:12.209 "num_blocks": 63488, 00:14:12.209 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:12.209 "assigned_rate_limits": { 00:14:12.209 "rw_ios_per_sec": 0, 00:14:12.209 "rw_mbytes_per_sec": 0, 00:14:12.209 "r_mbytes_per_sec": 0, 00:14:12.209 "w_mbytes_per_sec": 0 00:14:12.209 }, 00:14:12.209 "claimed": false, 00:14:12.209 "zoned": false, 00:14:12.209 "supported_io_types": { 00:14:12.209 "read": true, 00:14:12.209 "write": true, 00:14:12.209 "unmap": false, 00:14:12.209 "flush": false, 00:14:12.209 "reset": true, 00:14:12.209 "nvme_admin": false, 00:14:12.209 "nvme_io": false, 00:14:12.209 "nvme_io_md": false, 00:14:12.209 "write_zeroes": true, 00:14:12.209 "zcopy": false, 00:14:12.209 "get_zone_info": false, 00:14:12.209 "zone_management": false, 00:14:12.209 "zone_append": false, 00:14:12.209 "compare": false, 00:14:12.209 "compare_and_write": false, 00:14:12.209 "abort": false, 00:14:12.209 "seek_hole": false, 00:14:12.209 "seek_data": false, 00:14:12.209 "copy": false, 00:14:12.209 "nvme_iov_md": false 00:14:12.209 }, 
00:14:12.209 "memory_domains": [ 00:14:12.209 { 00:14:12.209 "dma_device_id": "system", 00:14:12.209 "dma_device_type": 1 00:14:12.209 }, 00:14:12.209 { 00:14:12.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.209 "dma_device_type": 2 00:14:12.209 }, 00:14:12.209 { 00:14:12.209 "dma_device_id": "system", 00:14:12.209 "dma_device_type": 1 00:14:12.209 }, 00:14:12.209 { 00:14:12.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.209 "dma_device_type": 2 00:14:12.209 } 00:14:12.209 ], 00:14:12.209 "driver_specific": { 00:14:12.209 "raid": { 00:14:12.209 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:12.209 "strip_size_kb": 0, 00:14:12.209 "state": "online", 00:14:12.209 "raid_level": "raid1", 00:14:12.209 "superblock": true, 00:14:12.209 "num_base_bdevs": 2, 00:14:12.209 "num_base_bdevs_discovered": 2, 00:14:12.209 "num_base_bdevs_operational": 2, 00:14:12.209 "base_bdevs_list": [ 00:14:12.209 { 00:14:12.209 "name": "pt1", 00:14:12.209 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.209 "is_configured": true, 00:14:12.209 "data_offset": 2048, 00:14:12.209 "data_size": 63488 00:14:12.209 }, 00:14:12.209 { 00:14:12.209 "name": "pt2", 00:14:12.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.209 "is_configured": true, 00:14:12.209 "data_offset": 2048, 00:14:12.209 "data_size": 63488 00:14:12.209 } 00:14:12.209 ] 00:14:12.209 } 00:14:12.209 } 00:14:12.209 }' 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:12.209 pt2' 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:12.209 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.468 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.468 "name": "pt1", 00:14:12.468 "aliases": [ 00:14:12.468 "00000000-0000-0000-0000-000000000001" 00:14:12.468 ], 00:14:12.468 "product_name": "passthru", 00:14:12.468 "block_size": 512, 00:14:12.468 "num_blocks": 65536, 00:14:12.468 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.468 "assigned_rate_limits": { 00:14:12.468 "rw_ios_per_sec": 0, 00:14:12.468 "rw_mbytes_per_sec": 0, 00:14:12.468 "r_mbytes_per_sec": 0, 00:14:12.468 "w_mbytes_per_sec": 0 00:14:12.468 }, 00:14:12.468 "claimed": true, 00:14:12.468 "claim_type": "exclusive_write", 00:14:12.468 "zoned": false, 00:14:12.468 "supported_io_types": { 00:14:12.468 "read": true, 00:14:12.468 "write": true, 00:14:12.468 "unmap": true, 00:14:12.468 "flush": true, 00:14:12.468 "reset": true, 00:14:12.468 "nvme_admin": false, 00:14:12.468 "nvme_io": false, 00:14:12.468 "nvme_io_md": false, 00:14:12.468 "write_zeroes": true, 00:14:12.468 "zcopy": true, 00:14:12.468 "get_zone_info": false, 00:14:12.468 "zone_management": false, 00:14:12.468 "zone_append": false, 00:14:12.468 "compare": false, 00:14:12.468 "compare_and_write": false, 00:14:12.468 "abort": true, 00:14:12.468 "seek_hole": false, 00:14:12.468 "seek_data": false, 00:14:12.468 "copy": true, 00:14:12.468 "nvme_iov_md": false 00:14:12.468 }, 00:14:12.468 "memory_domains": [ 00:14:12.468 { 00:14:12.468 
"dma_device_id": "system", 00:14:12.468 "dma_device_type": 1 00:14:12.468 }, 00:14:12.468 { 00:14:12.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.468 "dma_device_type": 2 00:14:12.468 } 00:14:12.468 ], 00:14:12.469 "driver_specific": { 00:14:12.469 "passthru": { 00:14:12.469 "name": "pt1", 00:14:12.469 "base_bdev_name": "malloc1" 00:14:12.469 } 00:14:12.469 } 00:14:12.469 }' 00:14:12.469 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.469 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.469 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.469 07:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.726 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.726 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.726 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.726 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.726 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.726 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.726 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.985 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.985 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.985 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:12.985 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.985 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.985 "name": "pt2", 00:14:12.985 "aliases": [ 00:14:12.985 "00000000-0000-0000-0000-000000000002" 00:14:12.985 ], 00:14:12.985 "product_name": "passthru", 00:14:12.985 "block_size": 512, 00:14:12.986 "num_blocks": 65536, 00:14:12.986 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.986 "assigned_rate_limits": { 00:14:12.986 "rw_ios_per_sec": 0, 00:14:12.986 "rw_mbytes_per_sec": 0, 00:14:12.986 "r_mbytes_per_sec": 0, 00:14:12.986 "w_mbytes_per_sec": 0 00:14:12.986 }, 00:14:12.986 "claimed": true, 00:14:12.986 "claim_type": "exclusive_write", 00:14:12.986 "zoned": false, 00:14:12.986 "supported_io_types": { 00:14:12.986 "read": true, 00:14:12.986 "write": true, 00:14:12.986 "unmap": true, 00:14:12.986 "flush": true, 00:14:12.986 "reset": true, 00:14:12.986 "nvme_admin": false, 00:14:12.986 "nvme_io": false, 00:14:12.986 "nvme_io_md": false, 00:14:12.986 "write_zeroes": true, 00:14:12.986 "zcopy": true, 00:14:12.986 "get_zone_info": false, 00:14:12.986 "zone_management": false, 00:14:12.986 "zone_append": false, 00:14:12.986 "compare": false, 00:14:12.986 "compare_and_write": false, 00:14:12.986 "abort": true, 00:14:12.986 "seek_hole": false, 00:14:12.986 "seek_data": false, 00:14:12.986 "copy": true, 00:14:12.986 "nvme_iov_md": false 00:14:12.986 }, 00:14:12.986 "memory_domains": [ 00:14:12.986 { 00:14:12.986 "dma_device_id": "system", 00:14:12.986 "dma_device_type": 1 00:14:12.986 }, 00:14:12.986 { 00:14:12.986 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:12.986 "dma_device_type": 2 00:14:12.986 } 00:14:12.986 ], 00:14:12.986 "driver_specific": { 00:14:12.986 "passthru": { 00:14:12.986 "name": "pt2", 00:14:12.986 "base_bdev_name": "malloc2" 00:14:12.986 } 00:14:12.986 } 00:14:12.986 }' 00:14:12.986 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.244 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.244 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.244 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.244 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.244 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.244 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.244 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.504 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.504 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.504 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.504 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.504 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:13.504 07:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:14:13.763 [2024-07-25 07:19:46.054277] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.763 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=238a161e-3b83-4bcd-be94-415d12e4aec7 00:14:13.763 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 238a161e-3b83-4bcd-be94-415d12e4aec7 ']' 00:14:13.763 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:13.763 [2024-07-25 07:19:46.282640] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:13.763 [2024-07-25 07:19:46.282657] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:13.763 [2024-07-25 07:19:46.282707] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:13.763 [2024-07-25 07:19:46.282758] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:13.763 [2024-07-25 07:19:46.282768] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb4720 name raid_bdev1, state offline 00:14:14.022 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.022 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:14:14.022 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:14:14.022 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:14:14.022 07:19:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:14.022 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:14.282 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:14.282 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:14.542 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:14.542 07:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:14.816 [2024-07-25 07:19:47.325354] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:14.816 [2024-07-25 07:19:47.326602] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:14.816 [2024-07-25 07:19:47.326652] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:14.816 [2024-07-25 07:19:47.326687] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a 
different raid bdev found on bdev malloc2 00:14:14.816 [2024-07-25 07:19:47.326705] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:14.816 [2024-07-25 07:19:47.326713] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb5d40 name raid_bdev1, state configuring 00:14:14.816 request: 00:14:14.816 { 00:14:14.816 "name": "raid_bdev1", 00:14:14.816 "raid_level": "raid1", 00:14:14.816 "base_bdevs": [ 00:14:14.816 "malloc1", 00:14:14.816 "malloc2" 00:14:14.816 ], 00:14:14.816 "superblock": false, 00:14:14.816 "method": "bdev_raid_create", 00:14:14.816 "req_id": 1 00:14:14.816 } 00:14:14.816 Got JSON-RPC error response 00:14:14.816 response: 00:14:14.816 { 00:14:14.816 "code": -17, 00:14:14.816 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:14.816 } 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:14.816 07:19:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:15.087 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.087 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:14:15.087 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:14:15.087 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:14:15.088 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:15.347 [2024-07-25 07:19:47.786523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:15.347 [2024-07-25 07:19:47.786557] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.347 [2024-07-25 07:19:47.786573] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb6100 00:14:15.347 [2024-07-25 07:19:47.786584] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.347 [2024-07-25 07:19:47.787921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.347 [2024-07-25 07:19:47.787946] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:15.347 [2024-07-25 07:19:47.787998] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:15.347 [2024-07-25 07:19:47.788021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:15.347 pt1 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:15.347 07:19:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.347 07:19:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:15.606 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.606 "name": "raid_bdev1", 00:14:15.606 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:15.606 "strip_size_kb": 0, 00:14:15.606 "state": "configuring", 00:14:15.606 "raid_level": "raid1", 00:14:15.606 "superblock": true, 00:14:15.606 "num_base_bdevs": 2, 00:14:15.606 "num_base_bdevs_discovered": 1, 00:14:15.606 "num_base_bdevs_operational": 2, 00:14:15.606 "base_bdevs_list": [ 00:14:15.606 { 00:14:15.606 "name": "pt1", 00:14:15.606 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:15.606 "is_configured": true, 00:14:15.606 "data_offset": 2048, 00:14:15.606 "data_size": 63488 00:14:15.606 }, 00:14:15.606 { 00:14:15.606 "name": null, 00:14:15.606 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:15.606 "is_configured": false, 00:14:15.606 "data_offset": 2048, 00:14:15.606 "data_size": 63488 00:14:15.606 } 00:14:15.606 ] 00:14:15.606 }' 00:14:15.606 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.606 07:19:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.174 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:14:16.174 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:14:16.174 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:16.174 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:16.433 [2024-07-25 07:19:48.837291] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:16.433 [2024-07-25 07:19:48.837337] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:16.433 [2024-07-25 07:19:48.837353] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0b4b0 00:14:16.433 [2024-07-25 07:19:48.837364] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:16.433 [2024-07-25 07:19:48.837662] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:16.433 [2024-07-25 07:19:48.837677] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:16.433 [2024-07-25 07:19:48.837730] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:16.433 [2024-07-25 07:19:48.837748] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:16.433 [2024-07-25 
07:19:48.837835] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe09bc0 00:14:16.433 [2024-07-25 07:19:48.837845] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:16.433 [2024-07-25 07:19:48.837998] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0bed0 00:14:16.433 [2024-07-25 07:19:48.838114] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe09bc0 00:14:16.433 [2024-07-25 07:19:48.838123] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe09bc0 00:14:16.433 [2024-07-25 07:19:48.838220] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:16.433 pt2 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.433 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.434 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.434 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.434 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.434 07:19:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.693 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.693 "name": "raid_bdev1", 00:14:16.693 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:16.693 "strip_size_kb": 0, 00:14:16.693 "state": "online", 00:14:16.693 "raid_level": "raid1", 00:14:16.693 "superblock": true, 00:14:16.693 "num_base_bdevs": 2, 00:14:16.693 "num_base_bdevs_discovered": 2, 00:14:16.693 "num_base_bdevs_operational": 2, 00:14:16.693 "base_bdevs_list": [ 00:14:16.693 { 00:14:16.693 "name": "pt1", 00:14:16.693 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:16.693 "is_configured": true, 00:14:16.693 "data_offset": 2048, 00:14:16.693 "data_size": 63488 00:14:16.693 }, 00:14:16.693 { 00:14:16.693 "name": "pt2", 00:14:16.693 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.693 "is_configured": true, 00:14:16.693 "data_offset": 2048, 00:14:16.693 "data_size": 63488 00:14:16.693 } 00:14:16.693 ] 00:14:16.693 }' 00:14:16.693 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.693 07:19:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.261 07:19:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:14:17.261 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:17.261 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:17.261 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:17.261 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:17.261 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:17.261 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:17.261 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:17.522 [2024-07-25 07:19:49.884273] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:17.522 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:17.522 "name": "raid_bdev1", 00:14:17.522 "aliases": [ 00:14:17.522 "238a161e-3b83-4bcd-be94-415d12e4aec7" 00:14:17.522 ], 00:14:17.522 "product_name": "Raid Volume", 00:14:17.522 "block_size": 512, 00:14:17.522 "num_blocks": 63488, 00:14:17.522 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:17.522 "assigned_rate_limits": { 00:14:17.522 "rw_ios_per_sec": 0, 00:14:17.522 "rw_mbytes_per_sec": 0, 00:14:17.522 "r_mbytes_per_sec": 0, 00:14:17.522 "w_mbytes_per_sec": 0 00:14:17.522 }, 00:14:17.522 "claimed": false, 00:14:17.522 "zoned": false, 00:14:17.522 "supported_io_types": { 00:14:17.522 "read": true, 00:14:17.522 "write": true, 00:14:17.522 "unmap": false, 00:14:17.522 "flush": false, 00:14:17.522 "reset": true, 00:14:17.522 "nvme_admin": false, 00:14:17.522 "nvme_io": false, 00:14:17.522 "nvme_io_md": false, 00:14:17.522 "write_zeroes": true, 00:14:17.522 "zcopy": false, 00:14:17.522 "get_zone_info": false, 00:14:17.522 "zone_management": false, 00:14:17.522 "zone_append": false, 00:14:17.522 "compare": false, 00:14:17.522 "compare_and_write": false, 00:14:17.522 "abort": false, 00:14:17.522 "seek_hole": false, 00:14:17.522 "seek_data": false, 00:14:17.522 "copy": false, 00:14:17.522 "nvme_iov_md": false 00:14:17.522 }, 00:14:17.522 "memory_domains": [ 00:14:17.522 { 00:14:17.522 "dma_device_id": "system", 00:14:17.522 "dma_device_type": 1 00:14:17.522 }, 00:14:17.522 { 00:14:17.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.522 "dma_device_type": 2 00:14:17.522 }, 00:14:17.522 { 00:14:17.522 "dma_device_id": "system", 00:14:17.522 "dma_device_type": 1 00:14:17.522 }, 00:14:17.522 { 00:14:17.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.522 "dma_device_type": 2 00:14:17.522 } 00:14:17.522 ], 00:14:17.522 "driver_specific": { 00:14:17.522 "raid": { 00:14:17.522 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:17.522 "strip_size_kb": 0, 00:14:17.522 "state": "online", 00:14:17.522 "raid_level": "raid1", 00:14:17.522 "superblock": true, 00:14:17.522 "num_base_bdevs": 2, 00:14:17.522 "num_base_bdevs_discovered": 2, 00:14:17.522 "num_base_bdevs_operational": 2, 00:14:17.522 "base_bdevs_list": [ 00:14:17.522 { 00:14:17.522 "name": "pt1", 00:14:17.522 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.522 "is_configured": true, 00:14:17.522 "data_offset": 2048, 00:14:17.522 "data_size": 63488 00:14:17.522 }, 00:14:17.522 { 00:14:17.522 
"name": "pt2", 00:14:17.522 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:17.522 "is_configured": true, 00:14:17.522 "data_offset": 2048, 00:14:17.522 "data_size": 63488 00:14:17.522 } 00:14:17.522 ] 00:14:17.522 } 00:14:17.522 } 00:14:17.522 }' 00:14:17.522 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:17.522 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:17.522 pt2' 00:14:17.522 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:17.522 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:17.522 07:19:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.782 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.782 "name": "pt1", 00:14:17.782 "aliases": [ 00:14:17.782 "00000000-0000-0000-0000-000000000001" 00:14:17.782 ], 00:14:17.782 "product_name": "passthru", 00:14:17.782 "block_size": 512, 00:14:17.782 "num_blocks": 65536, 00:14:17.782 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.782 "assigned_rate_limits": { 00:14:17.782 "rw_ios_per_sec": 0, 00:14:17.782 "rw_mbytes_per_sec": 0, 00:14:17.782 "r_mbytes_per_sec": 0, 00:14:17.782 "w_mbytes_per_sec": 0 00:14:17.782 }, 00:14:17.782 "claimed": true, 00:14:17.782 "claim_type": "exclusive_write", 00:14:17.782 "zoned": false, 00:14:17.782 "supported_io_types": { 00:14:17.782 "read": true, 00:14:17.782 "write": true, 00:14:17.782 "unmap": true, 00:14:17.782 "flush": true, 00:14:17.782 "reset": true, 00:14:17.782 "nvme_admin": false, 00:14:17.782 "nvme_io": false, 00:14:17.782 "nvme_io_md": false, 00:14:17.782 "write_zeroes": true, 00:14:17.782 "zcopy": true, 00:14:17.782 "get_zone_info": false, 00:14:17.782 "zone_management": false, 00:14:17.782 "zone_append": false, 00:14:17.782 "compare": false, 00:14:17.782 "compare_and_write": false, 00:14:17.782 "abort": true, 00:14:17.782 "seek_hole": false, 00:14:17.782 "seek_data": false, 00:14:17.782 "copy": true, 00:14:17.782 "nvme_iov_md": false 00:14:17.782 }, 00:14:17.782 "memory_domains": [ 00:14:17.782 { 00:14:17.782 "dma_device_id": "system", 00:14:17.782 "dma_device_type": 1 00:14:17.782 }, 00:14:17.782 { 00:14:17.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.782 "dma_device_type": 2 00:14:17.782 } 00:14:17.782 ], 00:14:17.782 "driver_specific": { 00:14:17.782 "passthru": { 00:14:17.782 "name": "pt1", 00:14:17.782 "base_bdev_name": "malloc1" 00:14:17.782 } 00:14:17.782 } 00:14:17.782 }' 00:14:17.782 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.782 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.782 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.782 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:18.041 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.300 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.300 "name": "pt2", 00:14:18.300 "aliases": [ 00:14:18.300 "00000000-0000-0000-0000-000000000002" 00:14:18.300 ], 00:14:18.300 "product_name": "passthru", 00:14:18.300 "block_size": 512, 00:14:18.300 "num_blocks": 65536, 00:14:18.300 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:18.300 "assigned_rate_limits": { 00:14:18.300 "rw_ios_per_sec": 0, 00:14:18.300 "rw_mbytes_per_sec": 0, 00:14:18.300 "r_mbytes_per_sec": 0, 00:14:18.300 "w_mbytes_per_sec": 0 00:14:18.300 }, 00:14:18.300 "claimed": true, 00:14:18.300 "claim_type": "exclusive_write", 00:14:18.300 "zoned": false, 00:14:18.300 "supported_io_types": { 00:14:18.300 "read": true, 00:14:18.300 "write": true, 00:14:18.300 "unmap": true, 00:14:18.300 "flush": true, 00:14:18.300 "reset": true, 00:14:18.300 "nvme_admin": false, 00:14:18.300 "nvme_io": false, 00:14:18.300 "nvme_io_md": false, 00:14:18.300 "write_zeroes": true, 00:14:18.300 "zcopy": true, 00:14:18.300 "get_zone_info": false, 00:14:18.300 "zone_management": false, 00:14:18.300 "zone_append": false, 00:14:18.300 "compare": false, 00:14:18.300 "compare_and_write": false, 00:14:18.300 "abort": true, 00:14:18.300 "seek_hole": false, 00:14:18.300 "seek_data": false, 00:14:18.300 "copy": true, 00:14:18.300 "nvme_iov_md": false 00:14:18.300 }, 00:14:18.300 "memory_domains": [ 00:14:18.300 { 00:14:18.300 "dma_device_id": "system", 00:14:18.300 "dma_device_type": 1 00:14:18.300 }, 00:14:18.300 { 00:14:18.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.300 "dma_device_type": 2 00:14:18.300 } 00:14:18.300 ], 00:14:18.300 "driver_specific": { 00:14:18.300 "passthru": { 00:14:18.300 "name": "pt2", 00:14:18.300 "base_bdev_name": "malloc2" 00:14:18.300 } 00:14:18.300 } 00:14:18.300 }' 00:14:18.300 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.300 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.559 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.559 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.559 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.559 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.559 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.559 07:19:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.559 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.559 
07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.559 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:18.819 [2024-07-25 07:19:51.324051] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 238a161e-3b83-4bcd-be94-415d12e4aec7 '!=' 238a161e-3b83-4bcd-be94-415d12e4aec7 ']' 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:18.819 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:19.078 [2024-07-25 07:19:51.556457] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.078 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:19.338 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.338 "name": "raid_bdev1", 00:14:19.338 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:19.338 "strip_size_kb": 0, 00:14:19.338 "state": "online", 00:14:19.338 "raid_level": "raid1", 00:14:19.338 "superblock": true, 00:14:19.338 "num_base_bdevs": 2, 00:14:19.338 "num_base_bdevs_discovered": 1, 00:14:19.338 "num_base_bdevs_operational": 1, 00:14:19.338 "base_bdevs_list": [ 00:14:19.338 { 00:14:19.338 "name": null, 00:14:19.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.338 "is_configured": false, 00:14:19.338 "data_offset": 
2048, 00:14:19.338 "data_size": 63488 00:14:19.338 }, 00:14:19.338 { 00:14:19.338 "name": "pt2", 00:14:19.338 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:19.338 "is_configured": true, 00:14:19.338 "data_offset": 2048, 00:14:19.338 "data_size": 63488 00:14:19.338 } 00:14:19.338 ] 00:14:19.338 }' 00:14:19.338 07:19:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.338 07:19:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.906 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:20.165 [2024-07-25 07:19:52.579122] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:20.165 [2024-07-25 07:19:52.579151] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:20.165 [2024-07-25 07:19:52.579198] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:20.165 [2024-07-25 07:19:52.579235] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:20.165 [2024-07-25 07:19:52.579245] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe09bc0 name raid_bdev1, state offline 00:14:20.166 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.166 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:14:20.425 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:14:20.425 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:14:20.425 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:14:20.425 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:14:20.425 07:19:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:20.685 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:14:20.685 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:14:20.685 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:14:20.685 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:14:20.685 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=1 00:14:20.685 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:20.945 [2024-07-25 07:19:53.252940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:20.945 [2024-07-25 07:19:53.252983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:20.945 [2024-07-25 07:19:53.253002] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0b6e0 00:14:20.945 [2024-07-25 07:19:53.253014] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:20.945 [2024-07-25 07:19:53.254497] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:20.945 [2024-07-25 07:19:53.254524] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:20.945 [2024-07-25 07:19:53.254583] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:20.945 [2024-07-25 07:19:53.254606] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:20.945 [2024-07-25 07:19:53.254683] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfb5770 00:14:20.945 [2024-07-25 07:19:53.254692] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:20.945 [2024-07-25 07:19:53.254848] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0c6e0 00:14:20.945 [2024-07-25 07:19:53.254955] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfb5770 00:14:20.945 [2024-07-25 07:19:53.254964] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfb5770 00:14:20.945 [2024-07-25 07:19:53.255052] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:20.945 pt2 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.945 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:21.204 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.204 "name": "raid_bdev1", 00:14:21.204 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:21.204 "strip_size_kb": 0, 00:14:21.204 "state": "online", 00:14:21.204 "raid_level": "raid1", 00:14:21.204 "superblock": true, 00:14:21.204 "num_base_bdevs": 2, 00:14:21.204 "num_base_bdevs_discovered": 1, 00:14:21.204 "num_base_bdevs_operational": 1, 00:14:21.204 "base_bdevs_list": [ 00:14:21.204 { 00:14:21.204 "name": null, 00:14:21.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.204 "is_configured": false, 00:14:21.204 "data_offset": 2048, 00:14:21.204 "data_size": 63488 00:14:21.204 }, 00:14:21.204 { 00:14:21.204 "name": "pt2", 00:14:21.204 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:21.204 "is_configured": true, 00:14:21.204 "data_offset": 2048, 00:14:21.204 "data_size": 63488 00:14:21.204 } 
00:14:21.204 ] 00:14:21.204 }' 00:14:21.204 07:19:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.204 07:19:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.772 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:21.772 [2024-07-25 07:19:54.279642] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:21.772 [2024-07-25 07:19:54.279669] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:21.772 [2024-07-25 07:19:54.279715] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:21.772 [2024-07-25 07:19:54.279755] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:21.772 [2024-07-25 07:19:54.279766] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb5770 name raid_bdev1, state offline 00:14:21.772 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.772 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:14:22.032 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:14:22.032 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:14:22.032 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:14:22.032 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:22.291 [2024-07-25 07:19:54.732823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:22.291 [2024-07-25 07:19:54.732865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:22.291 [2024-07-25 07:19:54.732882] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb4470 00:14:22.291 [2024-07-25 07:19:54.732893] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.291 [2024-07-25 07:19:54.734386] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.291 [2024-07-25 07:19:54.734413] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:22.291 [2024-07-25 07:19:54.734471] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:22.291 [2024-07-25 07:19:54.734494] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:22.291 [2024-07-25 07:19:54.734583] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:14:22.291 [2024-07-25 07:19:54.734594] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:22.291 [2024-07-25 07:19:54.734607] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbc320 name raid_bdev1, state configuring 00:14:22.291 [2024-07-25 07:19:54.734628] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:22.291 [2024-07-25 07:19:54.734679] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io 
device register 0xfbba50 00:14:22.291 [2024-07-25 07:19:54.734688] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:22.291 [2024-07-25 07:19:54.734844] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0bed0 00:14:22.291 [2024-07-25 07:19:54.734953] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfbba50 00:14:22.291 [2024-07-25 07:19:54.734962] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfbba50 00:14:22.291 [2024-07-25 07:19:54.735049] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.291 pt1 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.291 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:22.550 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.550 "name": "raid_bdev1", 00:14:22.550 "uuid": "238a161e-3b83-4bcd-be94-415d12e4aec7", 00:14:22.550 "strip_size_kb": 0, 00:14:22.550 "state": "online", 00:14:22.550 "raid_level": "raid1", 00:14:22.550 "superblock": true, 00:14:22.550 "num_base_bdevs": 2, 00:14:22.550 "num_base_bdevs_discovered": 1, 00:14:22.550 "num_base_bdevs_operational": 1, 00:14:22.550 "base_bdevs_list": [ 00:14:22.550 { 00:14:22.550 "name": null, 00:14:22.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.550 "is_configured": false, 00:14:22.550 "data_offset": 2048, 00:14:22.550 "data_size": 63488 00:14:22.550 }, 00:14:22.550 { 00:14:22.550 "name": "pt2", 00:14:22.550 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:22.550 "is_configured": true, 00:14:22.550 "data_offset": 2048, 00:14:22.550 "data_size": 63488 00:14:22.550 } 00:14:22.550 ] 00:14:22.550 }' 00:14:22.550 07:19:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.550 07:19:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.118 07:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
online 00:14:23.118 07:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:23.377 07:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:14:23.377 07:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:23.377 07:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:14:23.636 [2024-07-25 07:19:55.968396] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:23.636 07:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' 238a161e-3b83-4bcd-be94-415d12e4aec7 '!=' 238a161e-3b83-4bcd-be94-415d12e4aec7 ']' 00:14:23.636 07:19:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1606872 00:14:23.636 07:19:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1606872 ']' 00:14:23.636 07:19:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1606872 00:14:23.636 07:19:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:14:23.636 07:19:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:23.636 07:19:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1606872 00:14:23.637 07:19:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:23.637 07:19:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:23.637 07:19:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1606872' 00:14:23.637 killing process with pid 1606872 00:14:23.637 07:19:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1606872 00:14:23.637 [2024-07-25 07:19:56.044549] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:23.637 [2024-07-25 07:19:56.044600] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:23.637 [2024-07-25 07:19:56.044639] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:23.637 [2024-07-25 07:19:56.044649] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbba50 name raid_bdev1, state offline 00:14:23.637 07:19:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1606872 00:14:23.637 [2024-07-25 07:19:56.060544] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:23.896 07:19:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:14:23.896 00:14:23.896 real 0m14.699s 00:14:23.896 user 0m26.725s 00:14:23.896 sys 0m2.633s 00:14:23.896 07:19:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:23.896 07:19:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.896 ************************************ 00:14:23.896 END TEST raid_superblock_test 00:14:23.896 ************************************ 00:14:23.896 07:19:56 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:14:23.896 07:19:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:23.896 07:19:56 bdev_raid -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:14:23.896 07:19:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:23.896 ************************************ 00:14:23.896 START TEST raid_read_error_test 00:14:23.896 ************************************ 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 read 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.VPurgT3Nos 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1609746 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1609746 /var/tmp/spdk-raid.sock 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1609746 ']' 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:23.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:23.896 07:19:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.896 [2024-07-25 07:19:56.396160] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:14:23.896 [2024-07-25 07:19:56.396218] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1609746 ] 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:24.156 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:24.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:24.156 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:24.156 [2024-07-25 07:19:56.526008] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:24.156 [2024-07-25 07:19:56.610570] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:14:24.156 [2024-07-25 07:19:56.670193] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:24.156 [2024-07-25 07:19:56.670255] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:24.725 07:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:24.725 07:19:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:24.725 07:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:24.725 07:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:24.984 BaseBdev1_malloc 00:14:24.984 07:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:25.244 true 00:14:25.244 07:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:25.503 [2024-07-25 07:19:57.888166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:25.503 [2024-07-25 07:19:57.888208] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:25.503 [2024-07-25 
07:19:57.888227] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2595a50 00:14:25.503 [2024-07-25 07:19:57.888239] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:25.503 [2024-07-25 07:19:57.889731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:25.503 [2024-07-25 07:19:57.889759] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:25.503 BaseBdev1 00:14:25.503 07:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:25.503 07:19:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:25.762 BaseBdev2_malloc 00:14:25.762 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:26.054 true 00:14:26.054 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:26.312 [2024-07-25 07:19:58.570408] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:26.312 [2024-07-25 07:19:58.570447] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:26.312 [2024-07-25 07:19:58.570466] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x273ef40 00:14:26.312 [2024-07-25 07:19:58.570478] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:26.312 [2024-07-25 07:19:58.571887] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:26.312 [2024-07-25 07:19:58.571913] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:26.312 BaseBdev2 00:14:26.312 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:26.312 [2024-07-25 07:19:58.795014] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:26.312 [2024-07-25 07:19:58.796243] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:26.312 [2024-07-25 07:19:58.796425] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2741860 00:14:26.312 [2024-07-25 07:19:58.796438] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:26.312 [2024-07-25 07:19:58.796613] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2741280 00:14:26.312 [2024-07-25 07:19:58.796753] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2741860 00:14:26.312 [2024-07-25 07:19:58.796763] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2741860 00:14:26.312 [2024-07-25 07:19:58.796863] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:26.312 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:26.312 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:14:26.312 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:26.312 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.313 07:19:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:26.571 07:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.571 "name": "raid_bdev1", 00:14:26.571 "uuid": "3bb4b490-9ed7-4d38-bfd4-9304a0d681ab", 00:14:26.571 "strip_size_kb": 0, 00:14:26.571 "state": "online", 00:14:26.571 "raid_level": "raid1", 00:14:26.571 "superblock": true, 00:14:26.571 "num_base_bdevs": 2, 00:14:26.571 "num_base_bdevs_discovered": 2, 00:14:26.571 "num_base_bdevs_operational": 2, 00:14:26.571 "base_bdevs_list": [ 00:14:26.571 { 00:14:26.571 "name": "BaseBdev1", 00:14:26.571 "uuid": "9a9529ba-e47d-5fe3-b905-ba5d362321df", 00:14:26.571 "is_configured": true, 00:14:26.571 "data_offset": 2048, 00:14:26.571 "data_size": 63488 00:14:26.571 }, 00:14:26.571 { 00:14:26.571 "name": "BaseBdev2", 00:14:26.571 "uuid": "6e251681-34d3-5d57-a827-8c1883ea114b", 00:14:26.571 "is_configured": true, 00:14:26.571 "data_offset": 2048, 00:14:26.571 "data_size": 63488 00:14:26.571 } 00:14:26.571 ] 00:14:26.571 }' 00:14:26.571 07:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.571 07:19:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.137 07:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:14:27.137 07:19:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:27.396 [2024-07-25 07:19:59.697644] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2740d30 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.332 07:20:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:28.590 07:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.590 "name": "raid_bdev1", 00:14:28.590 "uuid": "3bb4b490-9ed7-4d38-bfd4-9304a0d681ab", 00:14:28.590 "strip_size_kb": 0, 00:14:28.590 "state": "online", 00:14:28.590 "raid_level": "raid1", 00:14:28.590 "superblock": true, 00:14:28.590 "num_base_bdevs": 2, 00:14:28.590 "num_base_bdevs_discovered": 2, 00:14:28.590 "num_base_bdevs_operational": 2, 00:14:28.590 "base_bdevs_list": [ 00:14:28.590 { 00:14:28.590 "name": "BaseBdev1", 00:14:28.590 "uuid": "9a9529ba-e47d-5fe3-b905-ba5d362321df", 00:14:28.590 "is_configured": true, 00:14:28.590 "data_offset": 2048, 00:14:28.590 "data_size": 63488 00:14:28.590 }, 00:14:28.590 { 00:14:28.590 "name": "BaseBdev2", 00:14:28.590 "uuid": "6e251681-34d3-5d57-a827-8c1883ea114b", 00:14:28.590 "is_configured": true, 00:14:28.590 "data_offset": 2048, 00:14:28.590 "data_size": 63488 00:14:28.590 } 00:14:28.590 ] 00:14:28.590 }' 00:14:28.590 07:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.590 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.186 07:20:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:29.445 [2024-07-25 07:20:01.850000] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:29.445 [2024-07-25 07:20:01.850042] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:29.445 [2024-07-25 07:20:01.852911] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:29.445 [2024-07-25 07:20:01.852940] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:29.445 [2024-07-25 07:20:01.853010] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:29.445 [2024-07-25 07:20:01.853022] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2741860 name raid_bdev1, state offline 00:14:29.445 0 00:14:29.445 07:20:01 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1609746 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1609746 ']' 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1609746 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1609746 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1609746' 00:14:29.445 killing process with pid 1609746 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1609746 00:14:29.445 [2024-07-25 07:20:01.929303] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:29.445 07:20:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1609746 00:14:29.445 [2024-07-25 07:20:01.939145] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.VPurgT3Nos 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:29.705 00:14:29.705 real 0m5.825s 00:14:29.705 user 0m9.022s 00:14:29.705 sys 0m1.032s 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:29.705 07:20:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.705 ************************************ 00:14:29.705 END TEST raid_read_error_test 00:14:29.705 ************************************ 00:14:29.705 07:20:02 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:14:29.705 07:20:02 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:29.705 07:20:02 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:29.705 07:20:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:29.705 ************************************ 00:14:29.705 START TEST raid_write_error_test 00:14:29.705 ************************************ 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 write 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local 
num_base_bdevs=2 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:14:29.705 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.FWoATOLEyl 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1610743 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1610743 /var/tmp/spdk-raid.sock 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1610743 ']' 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:29.964 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:29.964 07:20:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.964 [2024-07-25 07:20:02.307821] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:14:29.964 [2024-07-25 07:20:02.307882] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1610743 ] 00:14:29.964 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.964 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:29.964 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.964 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:29.964 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.964 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:29.964 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.964 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:29.964 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.964 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:29.964 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.964 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested 
device 0000:3f:01.4 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:29.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:29.965 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:29.965 [2024-07-25 07:20:02.442063] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:30.224 [2024-07-25 07:20:02.529109] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:14:30.224 [2024-07-25 07:20:02.592516] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.224 [2024-07-25 07:20:02.592575] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.792 07:20:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:30.792 07:20:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:30.792 07:20:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:30.792 07:20:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:31.051 BaseBdev1_malloc 00:14:31.051 07:20:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:31.310 true 00:14:31.310 07:20:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:31.568 [2024-07-25 07:20:03.861264] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:31.568 [2024-07-25 07:20:03.861305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.568 [2024-07-25 07:20:03.861323] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x100ca50 00:14:31.568 [2024-07-25 07:20:03.861334] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:14:31.568 [2024-07-25 07:20:03.862714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.568 [2024-07-25 07:20:03.862742] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:31.568 BaseBdev1 00:14:31.568 07:20:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:31.568 07:20:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:31.828 BaseBdev2_malloc 00:14:31.828 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:31.828 true 00:14:31.828 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:32.087 [2024-07-25 07:20:04.547499] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:32.087 [2024-07-25 07:20:04.547539] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:32.087 [2024-07-25 07:20:04.547555] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b5f40 00:14:32.087 [2024-07-25 07:20:04.547566] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:32.087 [2024-07-25 07:20:04.548848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:32.087 [2024-07-25 07:20:04.548875] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:32.087 BaseBdev2 00:14:32.087 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:32.346 [2024-07-25 07:20:04.772118] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:32.346 [2024-07-25 07:20:04.773187] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:32.346 [2024-07-25 07:20:04.773358] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11b8860 00:14:32.346 [2024-07-25 07:20:04.773370] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:32.346 [2024-07-25 07:20:04.773528] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11b8280 00:14:32.346 [2024-07-25 07:20:04.773660] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11b8860 00:14:32.346 [2024-07-25 07:20:04.773670] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11b8860 00:14:32.346 [2024-07-25 07:20:04.773760] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.346 07:20:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:32.606 07:20:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.606 "name": "raid_bdev1", 00:14:32.606 "uuid": "a4178519-3082-49a9-8350-28c225a6ff60", 00:14:32.606 "strip_size_kb": 0, 00:14:32.606 "state": "online", 00:14:32.606 "raid_level": "raid1", 00:14:32.606 "superblock": true, 00:14:32.606 "num_base_bdevs": 2, 00:14:32.606 "num_base_bdevs_discovered": 2, 00:14:32.606 "num_base_bdevs_operational": 2, 00:14:32.606 "base_bdevs_list": [ 00:14:32.606 { 00:14:32.606 "name": "BaseBdev1", 00:14:32.606 "uuid": "850a4fb5-8029-50c8-a87d-52efba14b639", 00:14:32.606 "is_configured": true, 00:14:32.606 "data_offset": 2048, 00:14:32.606 "data_size": 63488 00:14:32.606 }, 00:14:32.606 { 00:14:32.606 "name": "BaseBdev2", 00:14:32.606 "uuid": "2bfe9261-fd31-52e5-8c66-7695db8efcf8", 00:14:32.606 "is_configured": true, 00:14:32.606 "data_offset": 2048, 00:14:32.606 "data_size": 63488 00:14:32.606 } 00:14:32.606 ] 00:14:32.606 }' 00:14:32.606 07:20:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.606 07:20:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.175 07:20:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:14:33.175 07:20:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:33.175 [2024-07-25 07:20:05.678746] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11b7d30 00:14:34.112 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:34.372 [2024-07-25 07:20:06.789025] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:14:34.372 [2024-07-25 07:20:06.789089] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:34.372 [2024-07-25 07:20:06.789270] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x11b7d30 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 
00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=1 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.372 07:20:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:34.631 07:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.631 "name": "raid_bdev1", 00:14:34.631 "uuid": "a4178519-3082-49a9-8350-28c225a6ff60", 00:14:34.631 "strip_size_kb": 0, 00:14:34.631 "state": "online", 00:14:34.631 "raid_level": "raid1", 00:14:34.631 "superblock": true, 00:14:34.631 "num_base_bdevs": 2, 00:14:34.631 "num_base_bdevs_discovered": 1, 00:14:34.631 "num_base_bdevs_operational": 1, 00:14:34.631 "base_bdevs_list": [ 00:14:34.631 { 00:14:34.631 "name": null, 00:14:34.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.631 "is_configured": false, 00:14:34.631 "data_offset": 2048, 00:14:34.631 "data_size": 63488 00:14:34.631 }, 00:14:34.631 { 00:14:34.631 "name": "BaseBdev2", 00:14:34.631 "uuid": "2bfe9261-fd31-52e5-8c66-7695db8efcf8", 00:14:34.631 "is_configured": true, 00:14:34.631 "data_offset": 2048, 00:14:34.631 "data_size": 63488 00:14:34.631 } 00:14:34.631 ] 00:14:34.631 }' 00:14:34.631 07:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.631 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.198 07:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:35.457 [2024-07-25 07:20:07.851987] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:35.457 [2024-07-25 07:20:07.852018] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:35.457 [2024-07-25 07:20:07.854886] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:35.457 [2024-07-25 07:20:07.854910] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:35.457 [2024-07-25 07:20:07.854956] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:14:35.457 [2024-07-25 07:20:07.854967] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b8860 name raid_bdev1, state offline 00:14:35.457 0 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1610743 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1610743 ']' 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1610743 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1610743 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1610743' 00:14:35.457 killing process with pid 1610743 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1610743 00:14:35.457 [2024-07-25 07:20:07.928341] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:35.457 07:20:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1610743 00:14:35.457 [2024-07-25 07:20:07.937743] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:35.716 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.FWoATOLEyl 00:14:35.716 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:14:35.716 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:14:35.716 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:14:35.717 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:14:35.717 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:35.717 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:35.717 07:20:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:35.717 00:14:35.717 real 0m5.915s 00:14:35.717 user 0m9.172s 00:14:35.717 sys 0m1.058s 00:14:35.717 07:20:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:35.717 07:20:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.717 ************************************ 00:14:35.717 END TEST raid_write_error_test 00:14:35.717 ************************************ 00:14:35.717 07:20:08 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:14:35.717 07:20:08 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:14:35.717 07:20:08 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:14:35.717 07:20:08 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:35.717 07:20:08 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:35.717 07:20:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:35.717 ************************************ 00:14:35.717 START TEST 
raid_state_function_test 00:14:35.717 ************************************ 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 false 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1611883 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1611883' 00:14:35.717 Process raid pid: 1611883 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:35.717 07:20:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1611883 /var/tmp/spdk-raid.sock 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1611883 ']' 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:35.717 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:35.717 07:20:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.977 [2024-07-25 07:20:08.300942] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:14:35.977 [2024-07-25 07:20:08.301000] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached 
maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:35.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:35.977 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:35.977 [2024-07-25 07:20:08.434668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:36.237 [2024-07-25 07:20:08.517759] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:14:36.237 [2024-07-25 07:20:08.578933] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:36.237 [2024-07-25 07:20:08.578969] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:36.805 07:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:36.805 07:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:14:36.805 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:37.064 [2024-07-25 07:20:09.354494] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:37.064 [2024-07-25 07:20:09.354533] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:37.064 [2024-07-25 
07:20:09.354543] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:37.064 [2024-07-25 07:20:09.354554] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:37.064 [2024-07-25 07:20:09.354562] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:37.064 [2024-07-25 07:20:09.354572] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.064 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.324 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.324 "name": "Existed_Raid", 00:14:37.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.324 "strip_size_kb": 64, 00:14:37.324 "state": "configuring", 00:14:37.324 "raid_level": "raid0", 00:14:37.324 "superblock": false, 00:14:37.324 "num_base_bdevs": 3, 00:14:37.324 "num_base_bdevs_discovered": 0, 00:14:37.324 "num_base_bdevs_operational": 3, 00:14:37.324 "base_bdevs_list": [ 00:14:37.324 { 00:14:37.324 "name": "BaseBdev1", 00:14:37.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.324 "is_configured": false, 00:14:37.324 "data_offset": 0, 00:14:37.324 "data_size": 0 00:14:37.324 }, 00:14:37.324 { 00:14:37.324 "name": "BaseBdev2", 00:14:37.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.324 "is_configured": false, 00:14:37.324 "data_offset": 0, 00:14:37.324 "data_size": 0 00:14:37.324 }, 00:14:37.324 { 00:14:37.324 "name": "BaseBdev3", 00:14:37.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.324 "is_configured": false, 00:14:37.324 "data_offset": 0, 00:14:37.324 "data_size": 0 00:14:37.324 } 00:14:37.324 ] 00:14:37.324 }' 00:14:37.324 07:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.324 07:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.892 07:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:37.892 [2024-07-25 07:20:10.397123] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:37.892 [2024-07-25 07:20:10.397160] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2039ec0 name Existed_Raid, state configuring 00:14:37.892 07:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:38.151 [2024-07-25 07:20:10.625731] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:38.151 [2024-07-25 07:20:10.625757] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:38.151 [2024-07-25 07:20:10.625766] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:38.151 [2024-07-25 07:20:10.625776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:38.151 [2024-07-25 07:20:10.625784] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:38.151 [2024-07-25 07:20:10.625794] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:38.151 07:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:38.410 [2024-07-25 07:20:10.863750] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:38.410 BaseBdev1 00:14:38.410 07:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:38.410 07:20:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:38.410 07:20:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:38.410 07:20:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:38.410 07:20:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:38.410 07:20:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:38.410 07:20:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.669 07:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:38.928 [ 00:14:38.928 { 00:14:38.928 "name": "BaseBdev1", 00:14:38.928 "aliases": [ 00:14:38.928 "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0" 00:14:38.928 ], 00:14:38.928 "product_name": "Malloc disk", 00:14:38.928 "block_size": 512, 00:14:38.928 "num_blocks": 65536, 00:14:38.928 "uuid": "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0", 00:14:38.928 "assigned_rate_limits": { 00:14:38.928 "rw_ios_per_sec": 0, 00:14:38.928 "rw_mbytes_per_sec": 0, 00:14:38.928 "r_mbytes_per_sec": 0, 00:14:38.928 "w_mbytes_per_sec": 0 00:14:38.928 }, 00:14:38.928 "claimed": true, 00:14:38.928 "claim_type": "exclusive_write", 00:14:38.928 "zoned": false, 00:14:38.928 "supported_io_types": { 00:14:38.928 "read": true, 00:14:38.928 "write": true, 
00:14:38.928 "unmap": true, 00:14:38.928 "flush": true, 00:14:38.928 "reset": true, 00:14:38.928 "nvme_admin": false, 00:14:38.928 "nvme_io": false, 00:14:38.928 "nvme_io_md": false, 00:14:38.928 "write_zeroes": true, 00:14:38.928 "zcopy": true, 00:14:38.928 "get_zone_info": false, 00:14:38.928 "zone_management": false, 00:14:38.928 "zone_append": false, 00:14:38.928 "compare": false, 00:14:38.928 "compare_and_write": false, 00:14:38.928 "abort": true, 00:14:38.928 "seek_hole": false, 00:14:38.928 "seek_data": false, 00:14:38.928 "copy": true, 00:14:38.928 "nvme_iov_md": false 00:14:38.928 }, 00:14:38.928 "memory_domains": [ 00:14:38.928 { 00:14:38.928 "dma_device_id": "system", 00:14:38.928 "dma_device_type": 1 00:14:38.928 }, 00:14:38.928 { 00:14:38.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.928 "dma_device_type": 2 00:14:38.928 } 00:14:38.928 ], 00:14:38.928 "driver_specific": {} 00:14:38.928 } 00:14:38.928 ] 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.928 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.187 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.187 "name": "Existed_Raid", 00:14:39.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.187 "strip_size_kb": 64, 00:14:39.187 "state": "configuring", 00:14:39.187 "raid_level": "raid0", 00:14:39.187 "superblock": false, 00:14:39.187 "num_base_bdevs": 3, 00:14:39.187 "num_base_bdevs_discovered": 1, 00:14:39.187 "num_base_bdevs_operational": 3, 00:14:39.187 "base_bdevs_list": [ 00:14:39.187 { 00:14:39.187 "name": "BaseBdev1", 00:14:39.187 "uuid": "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0", 00:14:39.187 "is_configured": true, 00:14:39.187 "data_offset": 0, 00:14:39.187 "data_size": 65536 00:14:39.187 }, 00:14:39.187 { 00:14:39.187 "name": "BaseBdev2", 00:14:39.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.187 "is_configured": false, 00:14:39.187 "data_offset": 0, 00:14:39.187 "data_size": 0 00:14:39.187 }, 00:14:39.187 { 00:14:39.187 "name": "BaseBdev3", 00:14:39.187 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:39.187 "is_configured": false, 00:14:39.187 "data_offset": 0, 00:14:39.187 "data_size": 0 00:14:39.187 } 00:14:39.187 ] 00:14:39.187 }' 00:14:39.187 07:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.187 07:20:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:39.754 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:40.012 [2024-07-25 07:20:12.355681] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:40.012 [2024-07-25 07:20:12.355722] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2039790 name Existed_Raid, state configuring 00:14:40.012 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:40.270 [2024-07-25 07:20:12.584309] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:40.270 [2024-07-25 07:20:12.585680] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:40.271 [2024-07-25 07:20:12.585711] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:40.271 [2024-07-25 07:20:12.585721] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:40.271 [2024-07-25 07:20:12.585732] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.271 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.530 07:20:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.530 "name": "Existed_Raid", 00:14:40.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.530 "strip_size_kb": 64, 00:14:40.530 "state": "configuring", 00:14:40.530 "raid_level": "raid0", 00:14:40.530 "superblock": false, 00:14:40.530 "num_base_bdevs": 3, 00:14:40.530 "num_base_bdevs_discovered": 1, 00:14:40.530 "num_base_bdevs_operational": 3, 00:14:40.530 "base_bdevs_list": [ 00:14:40.530 { 00:14:40.530 "name": "BaseBdev1", 00:14:40.530 "uuid": "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0", 00:14:40.530 "is_configured": true, 00:14:40.530 "data_offset": 0, 00:14:40.530 "data_size": 65536 00:14:40.530 }, 00:14:40.530 { 00:14:40.530 "name": "BaseBdev2", 00:14:40.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.530 "is_configured": false, 00:14:40.530 "data_offset": 0, 00:14:40.530 "data_size": 0 00:14:40.530 }, 00:14:40.530 { 00:14:40.530 "name": "BaseBdev3", 00:14:40.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.530 "is_configured": false, 00:14:40.530 "data_offset": 0, 00:14:40.530 "data_size": 0 00:14:40.530 } 00:14:40.530 ] 00:14:40.530 }' 00:14:40.530 07:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.530 07:20:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.097 07:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:41.097 [2024-07-25 07:20:13.546128] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:41.097 BaseBdev2 00:14:41.097 07:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:41.098 07:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:41.098 07:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:41.098 07:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:41.098 07:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:41.098 07:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:41.098 07:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:41.356 07:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:41.658 [ 00:14:41.658 { 00:14:41.658 "name": "BaseBdev2", 00:14:41.658 "aliases": [ 00:14:41.658 "aa3ff959-5bf6-452d-a13e-0cce727e058c" 00:14:41.658 ], 00:14:41.658 "product_name": "Malloc disk", 00:14:41.658 "block_size": 512, 00:14:41.658 "num_blocks": 65536, 00:14:41.658 "uuid": "aa3ff959-5bf6-452d-a13e-0cce727e058c", 00:14:41.658 "assigned_rate_limits": { 00:14:41.658 "rw_ios_per_sec": 0, 00:14:41.658 "rw_mbytes_per_sec": 0, 00:14:41.658 "r_mbytes_per_sec": 0, 00:14:41.658 "w_mbytes_per_sec": 0 00:14:41.658 }, 00:14:41.658 "claimed": true, 00:14:41.658 "claim_type": "exclusive_write", 00:14:41.658 "zoned": false, 00:14:41.658 "supported_io_types": { 00:14:41.658 "read": true, 00:14:41.658 "write": true, 00:14:41.658 "unmap": 
true, 00:14:41.658 "flush": true, 00:14:41.658 "reset": true, 00:14:41.658 "nvme_admin": false, 00:14:41.658 "nvme_io": false, 00:14:41.658 "nvme_io_md": false, 00:14:41.658 "write_zeroes": true, 00:14:41.658 "zcopy": true, 00:14:41.658 "get_zone_info": false, 00:14:41.658 "zone_management": false, 00:14:41.658 "zone_append": false, 00:14:41.658 "compare": false, 00:14:41.658 "compare_and_write": false, 00:14:41.658 "abort": true, 00:14:41.658 "seek_hole": false, 00:14:41.658 "seek_data": false, 00:14:41.658 "copy": true, 00:14:41.658 "nvme_iov_md": false 00:14:41.658 }, 00:14:41.658 "memory_domains": [ 00:14:41.658 { 00:14:41.658 "dma_device_id": "system", 00:14:41.658 "dma_device_type": 1 00:14:41.658 }, 00:14:41.658 { 00:14:41.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.658 "dma_device_type": 2 00:14:41.658 } 00:14:41.658 ], 00:14:41.658 "driver_specific": {} 00:14:41.658 } 00:14:41.658 ] 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.658 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.926 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.926 "name": "Existed_Raid", 00:14:41.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.926 "strip_size_kb": 64, 00:14:41.926 "state": "configuring", 00:14:41.926 "raid_level": "raid0", 00:14:41.926 "superblock": false, 00:14:41.926 "num_base_bdevs": 3, 00:14:41.926 "num_base_bdevs_discovered": 2, 00:14:41.926 "num_base_bdevs_operational": 3, 00:14:41.926 "base_bdevs_list": [ 00:14:41.926 { 00:14:41.926 "name": "BaseBdev1", 00:14:41.926 "uuid": "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0", 00:14:41.926 "is_configured": true, 00:14:41.926 "data_offset": 0, 00:14:41.926 "data_size": 65536 00:14:41.926 }, 00:14:41.926 { 00:14:41.926 "name": "BaseBdev2", 00:14:41.926 "uuid": 
"aa3ff959-5bf6-452d-a13e-0cce727e058c", 00:14:41.926 "is_configured": true, 00:14:41.926 "data_offset": 0, 00:14:41.926 "data_size": 65536 00:14:41.926 }, 00:14:41.926 { 00:14:41.926 "name": "BaseBdev3", 00:14:41.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.926 "is_configured": false, 00:14:41.926 "data_offset": 0, 00:14:41.926 "data_size": 0 00:14:41.926 } 00:14:41.926 ] 00:14:41.926 }' 00:14:41.926 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.926 07:20:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.494 07:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:42.494 [2024-07-25 07:20:15.021241] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:42.494 [2024-07-25 07:20:15.021279] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x203a680 00:14:42.494 [2024-07-25 07:20:15.021287] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:42.494 [2024-07-25 07:20:15.021469] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x203c0b0 00:14:42.494 [2024-07-25 07:20:15.021585] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x203a680 00:14:42.494 [2024-07-25 07:20:15.021595] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x203a680 00:14:42.494 [2024-07-25 07:20:15.021749] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:42.495 BaseBdev3 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.754 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:43.013 [ 00:14:43.013 { 00:14:43.013 "name": "BaseBdev3", 00:14:43.013 "aliases": [ 00:14:43.013 "6210dd23-8258-420f-b6af-db62d983d70c" 00:14:43.013 ], 00:14:43.013 "product_name": "Malloc disk", 00:14:43.013 "block_size": 512, 00:14:43.013 "num_blocks": 65536, 00:14:43.013 "uuid": "6210dd23-8258-420f-b6af-db62d983d70c", 00:14:43.013 "assigned_rate_limits": { 00:14:43.013 "rw_ios_per_sec": 0, 00:14:43.013 "rw_mbytes_per_sec": 0, 00:14:43.013 "r_mbytes_per_sec": 0, 00:14:43.013 "w_mbytes_per_sec": 0 00:14:43.013 }, 00:14:43.013 "claimed": true, 00:14:43.013 "claim_type": "exclusive_write", 00:14:43.013 "zoned": false, 00:14:43.013 "supported_io_types": { 00:14:43.013 "read": true, 00:14:43.013 "write": true, 
00:14:43.013 "unmap": true, 00:14:43.013 "flush": true, 00:14:43.013 "reset": true, 00:14:43.013 "nvme_admin": false, 00:14:43.013 "nvme_io": false, 00:14:43.013 "nvme_io_md": false, 00:14:43.013 "write_zeroes": true, 00:14:43.013 "zcopy": true, 00:14:43.013 "get_zone_info": false, 00:14:43.013 "zone_management": false, 00:14:43.013 "zone_append": false, 00:14:43.013 "compare": false, 00:14:43.013 "compare_and_write": false, 00:14:43.013 "abort": true, 00:14:43.013 "seek_hole": false, 00:14:43.013 "seek_data": false, 00:14:43.013 "copy": true, 00:14:43.013 "nvme_iov_md": false 00:14:43.013 }, 00:14:43.013 "memory_domains": [ 00:14:43.013 { 00:14:43.013 "dma_device_id": "system", 00:14:43.013 "dma_device_type": 1 00:14:43.013 }, 00:14:43.013 { 00:14:43.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.013 "dma_device_type": 2 00:14:43.013 } 00:14:43.013 ], 00:14:43.013 "driver_specific": {} 00:14:43.013 } 00:14:43.013 ] 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.013 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.273 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.273 "name": "Existed_Raid", 00:14:43.273 "uuid": "50a71ec3-0dea-4895-a101-610854905f3d", 00:14:43.273 "strip_size_kb": 64, 00:14:43.273 "state": "online", 00:14:43.273 "raid_level": "raid0", 00:14:43.273 "superblock": false, 00:14:43.273 "num_base_bdevs": 3, 00:14:43.273 "num_base_bdevs_discovered": 3, 00:14:43.273 "num_base_bdevs_operational": 3, 00:14:43.273 "base_bdevs_list": [ 00:14:43.273 { 00:14:43.273 "name": "BaseBdev1", 00:14:43.273 "uuid": "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0", 00:14:43.273 "is_configured": true, 00:14:43.273 "data_offset": 0, 00:14:43.273 "data_size": 65536 00:14:43.273 }, 00:14:43.273 { 00:14:43.273 "name": "BaseBdev2", 00:14:43.273 "uuid": 
"aa3ff959-5bf6-452d-a13e-0cce727e058c", 00:14:43.273 "is_configured": true, 00:14:43.273 "data_offset": 0, 00:14:43.273 "data_size": 65536 00:14:43.273 }, 00:14:43.273 { 00:14:43.273 "name": "BaseBdev3", 00:14:43.273 "uuid": "6210dd23-8258-420f-b6af-db62d983d70c", 00:14:43.273 "is_configured": true, 00:14:43.273 "data_offset": 0, 00:14:43.273 "data_size": 65536 00:14:43.273 } 00:14:43.273 ] 00:14:43.273 }' 00:14:43.273 07:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.273 07:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:43.842 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:44.101 [2024-07-25 07:20:16.513450] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:44.101 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:44.101 "name": "Existed_Raid", 00:14:44.101 "aliases": [ 00:14:44.101 "50a71ec3-0dea-4895-a101-610854905f3d" 00:14:44.101 ], 00:14:44.101 "product_name": "Raid Volume", 00:14:44.101 "block_size": 512, 00:14:44.101 "num_blocks": 196608, 00:14:44.101 "uuid": "50a71ec3-0dea-4895-a101-610854905f3d", 00:14:44.101 "assigned_rate_limits": { 00:14:44.101 "rw_ios_per_sec": 0, 00:14:44.101 "rw_mbytes_per_sec": 0, 00:14:44.101 "r_mbytes_per_sec": 0, 00:14:44.101 "w_mbytes_per_sec": 0 00:14:44.101 }, 00:14:44.101 "claimed": false, 00:14:44.101 "zoned": false, 00:14:44.101 "supported_io_types": { 00:14:44.101 "read": true, 00:14:44.101 "write": true, 00:14:44.101 "unmap": true, 00:14:44.101 "flush": true, 00:14:44.101 "reset": true, 00:14:44.101 "nvme_admin": false, 00:14:44.101 "nvme_io": false, 00:14:44.101 "nvme_io_md": false, 00:14:44.101 "write_zeroes": true, 00:14:44.101 "zcopy": false, 00:14:44.101 "get_zone_info": false, 00:14:44.101 "zone_management": false, 00:14:44.101 "zone_append": false, 00:14:44.101 "compare": false, 00:14:44.101 "compare_and_write": false, 00:14:44.101 "abort": false, 00:14:44.101 "seek_hole": false, 00:14:44.101 "seek_data": false, 00:14:44.101 "copy": false, 00:14:44.101 "nvme_iov_md": false 00:14:44.101 }, 00:14:44.101 "memory_domains": [ 00:14:44.101 { 00:14:44.101 "dma_device_id": "system", 00:14:44.101 "dma_device_type": 1 00:14:44.101 }, 00:14:44.101 { 00:14:44.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.101 "dma_device_type": 2 00:14:44.101 }, 00:14:44.101 { 00:14:44.101 "dma_device_id": "system", 00:14:44.101 "dma_device_type": 1 00:14:44.102 }, 00:14:44.102 { 00:14:44.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.102 "dma_device_type": 2 00:14:44.102 }, 
00:14:44.102 { 00:14:44.102 "dma_device_id": "system", 00:14:44.102 "dma_device_type": 1 00:14:44.102 }, 00:14:44.102 { 00:14:44.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.102 "dma_device_type": 2 00:14:44.102 } 00:14:44.102 ], 00:14:44.102 "driver_specific": { 00:14:44.102 "raid": { 00:14:44.102 "uuid": "50a71ec3-0dea-4895-a101-610854905f3d", 00:14:44.102 "strip_size_kb": 64, 00:14:44.102 "state": "online", 00:14:44.102 "raid_level": "raid0", 00:14:44.102 "superblock": false, 00:14:44.102 "num_base_bdevs": 3, 00:14:44.102 "num_base_bdevs_discovered": 3, 00:14:44.102 "num_base_bdevs_operational": 3, 00:14:44.102 "base_bdevs_list": [ 00:14:44.102 { 00:14:44.102 "name": "BaseBdev1", 00:14:44.102 "uuid": "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0", 00:14:44.102 "is_configured": true, 00:14:44.102 "data_offset": 0, 00:14:44.102 "data_size": 65536 00:14:44.102 }, 00:14:44.102 { 00:14:44.102 "name": "BaseBdev2", 00:14:44.102 "uuid": "aa3ff959-5bf6-452d-a13e-0cce727e058c", 00:14:44.102 "is_configured": true, 00:14:44.102 "data_offset": 0, 00:14:44.102 "data_size": 65536 00:14:44.102 }, 00:14:44.102 { 00:14:44.102 "name": "BaseBdev3", 00:14:44.102 "uuid": "6210dd23-8258-420f-b6af-db62d983d70c", 00:14:44.102 "is_configured": true, 00:14:44.102 "data_offset": 0, 00:14:44.102 "data_size": 65536 00:14:44.102 } 00:14:44.102 ] 00:14:44.102 } 00:14:44.102 } 00:14:44.102 }' 00:14:44.102 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:44.102 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:44.102 BaseBdev2 00:14:44.102 BaseBdev3' 00:14:44.102 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.102 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:44.102 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.361 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.361 "name": "BaseBdev1", 00:14:44.361 "aliases": [ 00:14:44.361 "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0" 00:14:44.361 ], 00:14:44.361 "product_name": "Malloc disk", 00:14:44.362 "block_size": 512, 00:14:44.362 "num_blocks": 65536, 00:14:44.362 "uuid": "b6a6688f-07c3-4b36-b9b9-ca042c1f5ab0", 00:14:44.362 "assigned_rate_limits": { 00:14:44.362 "rw_ios_per_sec": 0, 00:14:44.362 "rw_mbytes_per_sec": 0, 00:14:44.362 "r_mbytes_per_sec": 0, 00:14:44.362 "w_mbytes_per_sec": 0 00:14:44.362 }, 00:14:44.362 "claimed": true, 00:14:44.362 "claim_type": "exclusive_write", 00:14:44.362 "zoned": false, 00:14:44.362 "supported_io_types": { 00:14:44.362 "read": true, 00:14:44.362 "write": true, 00:14:44.362 "unmap": true, 00:14:44.362 "flush": true, 00:14:44.362 "reset": true, 00:14:44.362 "nvme_admin": false, 00:14:44.362 "nvme_io": false, 00:14:44.362 "nvme_io_md": false, 00:14:44.362 "write_zeroes": true, 00:14:44.362 "zcopy": true, 00:14:44.362 "get_zone_info": false, 00:14:44.362 "zone_management": false, 00:14:44.362 "zone_append": false, 00:14:44.362 "compare": false, 00:14:44.362 "compare_and_write": false, 00:14:44.362 "abort": true, 00:14:44.362 "seek_hole": false, 00:14:44.362 "seek_data": false, 00:14:44.362 "copy": true, 00:14:44.362 "nvme_iov_md": false 00:14:44.362 }, 00:14:44.362 
"memory_domains": [ 00:14:44.362 { 00:14:44.362 "dma_device_id": "system", 00:14:44.362 "dma_device_type": 1 00:14:44.362 }, 00:14:44.362 { 00:14:44.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.362 "dma_device_type": 2 00:14:44.362 } 00:14:44.362 ], 00:14:44.362 "driver_specific": {} 00:14:44.362 }' 00:14:44.362 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.362 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.362 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.362 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.621 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.621 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:44.621 07:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.621 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.621 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:44.621 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.621 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.621 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:44.621 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.880 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.880 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:44.880 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.880 "name": "BaseBdev2", 00:14:44.880 "aliases": [ 00:14:44.880 "aa3ff959-5bf6-452d-a13e-0cce727e058c" 00:14:44.880 ], 00:14:44.880 "product_name": "Malloc disk", 00:14:44.880 "block_size": 512, 00:14:44.880 "num_blocks": 65536, 00:14:44.880 "uuid": "aa3ff959-5bf6-452d-a13e-0cce727e058c", 00:14:44.880 "assigned_rate_limits": { 00:14:44.880 "rw_ios_per_sec": 0, 00:14:44.880 "rw_mbytes_per_sec": 0, 00:14:44.880 "r_mbytes_per_sec": 0, 00:14:44.880 "w_mbytes_per_sec": 0 00:14:44.880 }, 00:14:44.880 "claimed": true, 00:14:44.880 "claim_type": "exclusive_write", 00:14:44.880 "zoned": false, 00:14:44.880 "supported_io_types": { 00:14:44.880 "read": true, 00:14:44.880 "write": true, 00:14:44.880 "unmap": true, 00:14:44.880 "flush": true, 00:14:44.880 "reset": true, 00:14:44.880 "nvme_admin": false, 00:14:44.880 "nvme_io": false, 00:14:44.880 "nvme_io_md": false, 00:14:44.880 "write_zeroes": true, 00:14:44.880 "zcopy": true, 00:14:44.880 "get_zone_info": false, 00:14:44.880 "zone_management": false, 00:14:44.880 "zone_append": false, 00:14:44.880 "compare": false, 00:14:44.881 "compare_and_write": false, 00:14:44.881 "abort": true, 00:14:44.881 "seek_hole": false, 00:14:44.881 "seek_data": false, 00:14:44.881 "copy": true, 00:14:44.881 "nvme_iov_md": false 00:14:44.881 }, 00:14:44.881 "memory_domains": [ 00:14:44.881 { 00:14:44.881 "dma_device_id": "system", 00:14:44.881 "dma_device_type": 1 00:14:44.881 }, 00:14:44.881 { 00:14:44.881 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:44.881 "dma_device_type": 2 00:14:44.881 } 00:14:44.881 ], 00:14:44.881 "driver_specific": {} 00:14:44.881 }' 00:14:44.881 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.140 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.399 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.399 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.399 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:45.399 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:45.399 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.659 07:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.659 "name": "BaseBdev3", 00:14:45.659 "aliases": [ 00:14:45.659 "6210dd23-8258-420f-b6af-db62d983d70c" 00:14:45.659 ], 00:14:45.659 "product_name": "Malloc disk", 00:14:45.659 "block_size": 512, 00:14:45.659 "num_blocks": 65536, 00:14:45.659 "uuid": "6210dd23-8258-420f-b6af-db62d983d70c", 00:14:45.659 "assigned_rate_limits": { 00:14:45.659 "rw_ios_per_sec": 0, 00:14:45.659 "rw_mbytes_per_sec": 0, 00:14:45.659 "r_mbytes_per_sec": 0, 00:14:45.659 "w_mbytes_per_sec": 0 00:14:45.659 }, 00:14:45.659 "claimed": true, 00:14:45.659 "claim_type": "exclusive_write", 00:14:45.659 "zoned": false, 00:14:45.659 "supported_io_types": { 00:14:45.659 "read": true, 00:14:45.659 "write": true, 00:14:45.659 "unmap": true, 00:14:45.659 "flush": true, 00:14:45.659 "reset": true, 00:14:45.659 "nvme_admin": false, 00:14:45.659 "nvme_io": false, 00:14:45.659 "nvme_io_md": false, 00:14:45.659 "write_zeroes": true, 00:14:45.659 "zcopy": true, 00:14:45.659 "get_zone_info": false, 00:14:45.659 "zone_management": false, 00:14:45.659 "zone_append": false, 00:14:45.659 "compare": false, 00:14:45.659 "compare_and_write": false, 00:14:45.659 "abort": true, 00:14:45.659 "seek_hole": false, 00:14:45.659 "seek_data": false, 00:14:45.659 "copy": true, 00:14:45.659 "nvme_iov_md": false 00:14:45.659 }, 00:14:45.659 "memory_domains": [ 00:14:45.659 { 00:14:45.659 "dma_device_id": "system", 00:14:45.659 "dma_device_type": 1 00:14:45.659 }, 00:14:45.659 { 00:14:45.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.659 "dma_device_type": 2 00:14:45.659 } 00:14:45.659 ], 00:14:45.659 "driver_specific": {} 00:14:45.659 }' 00:14:45.659 07:20:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.659 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.659 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.659 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.659 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.659 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.659 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.659 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.919 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.919 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.919 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.919 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.919 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:46.178 [2024-07-25 07:20:18.522616] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:46.178 [2024-07-25 07:20:18.522643] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:46.178 [2024-07-25 07:20:18.522681] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.178 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.438 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.438 "name": "Existed_Raid", 00:14:46.438 "uuid": "50a71ec3-0dea-4895-a101-610854905f3d", 00:14:46.438 "strip_size_kb": 64, 00:14:46.438 "state": "offline", 00:14:46.438 "raid_level": "raid0", 00:14:46.438 "superblock": false, 00:14:46.438 "num_base_bdevs": 3, 00:14:46.438 "num_base_bdevs_discovered": 2, 00:14:46.438 "num_base_bdevs_operational": 2, 00:14:46.438 "base_bdevs_list": [ 00:14:46.438 { 00:14:46.438 "name": null, 00:14:46.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.438 "is_configured": false, 00:14:46.438 "data_offset": 0, 00:14:46.438 "data_size": 65536 00:14:46.438 }, 00:14:46.438 { 00:14:46.438 "name": "BaseBdev2", 00:14:46.438 "uuid": "aa3ff959-5bf6-452d-a13e-0cce727e058c", 00:14:46.438 "is_configured": true, 00:14:46.438 "data_offset": 0, 00:14:46.438 "data_size": 65536 00:14:46.438 }, 00:14:46.438 { 00:14:46.438 "name": "BaseBdev3", 00:14:46.438 "uuid": "6210dd23-8258-420f-b6af-db62d983d70c", 00:14:46.438 "is_configured": true, 00:14:46.438 "data_offset": 0, 00:14:46.438 "data_size": 65536 00:14:46.438 } 00:14:46.438 ] 00:14:46.438 }' 00:14:46.438 07:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.438 07:20:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.007 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:47.007 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:47.007 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.007 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:47.266 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:47.267 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:47.267 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:47.267 [2024-07-25 07:20:19.791051] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:47.526 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:47.526 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:47.526 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.526 07:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:47.526 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:47.526 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:47.526 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:47.785 [2024-07-25 07:20:20.262820] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:47.785 [2024-07-25 07:20:20.262869] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x203a680 name Existed_Raid, state offline 00:14:47.785 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:47.785 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:47.785 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.785 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:48.353 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:48.353 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:48.353 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:48.353 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:48.353 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:48.353 07:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:48.611 BaseBdev2 00:14:48.611 07:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:48.611 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:48.611 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:48.611 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:48.611 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:48.611 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:48.611 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.870 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:49.129 [ 00:14:49.129 { 00:14:49.129 "name": "BaseBdev2", 00:14:49.129 "aliases": [ 00:14:49.129 "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42" 00:14:49.129 ], 00:14:49.129 "product_name": "Malloc disk", 00:14:49.129 "block_size": 512, 00:14:49.129 "num_blocks": 65536, 00:14:49.129 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:49.129 "assigned_rate_limits": { 00:14:49.129 "rw_ios_per_sec": 0, 00:14:49.129 "rw_mbytes_per_sec": 0, 00:14:49.129 "r_mbytes_per_sec": 0, 00:14:49.129 "w_mbytes_per_sec": 0 00:14:49.129 }, 00:14:49.129 "claimed": false, 00:14:49.129 "zoned": false, 00:14:49.129 "supported_io_types": { 00:14:49.129 "read": true, 00:14:49.129 "write": true, 00:14:49.129 "unmap": true, 00:14:49.129 "flush": true, 00:14:49.129 "reset": true, 00:14:49.129 "nvme_admin": false, 
00:14:49.129 "nvme_io": false, 00:14:49.129 "nvme_io_md": false, 00:14:49.129 "write_zeroes": true, 00:14:49.129 "zcopy": true, 00:14:49.129 "get_zone_info": false, 00:14:49.129 "zone_management": false, 00:14:49.129 "zone_append": false, 00:14:49.129 "compare": false, 00:14:49.129 "compare_and_write": false, 00:14:49.129 "abort": true, 00:14:49.129 "seek_hole": false, 00:14:49.129 "seek_data": false, 00:14:49.129 "copy": true, 00:14:49.129 "nvme_iov_md": false 00:14:49.129 }, 00:14:49.129 "memory_domains": [ 00:14:49.129 { 00:14:49.129 "dma_device_id": "system", 00:14:49.129 "dma_device_type": 1 00:14:49.129 }, 00:14:49.129 { 00:14:49.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.129 "dma_device_type": 2 00:14:49.129 } 00:14:49.129 ], 00:14:49.129 "driver_specific": {} 00:14:49.129 } 00:14:49.129 ] 00:14:49.129 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:49.129 07:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:49.129 07:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:49.129 07:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:49.388 BaseBdev3 00:14:49.388 07:20:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:49.388 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:49.388 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:49.388 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:49.388 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:49.388 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:49.388 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:49.647 07:20:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:49.647 [ 00:14:49.647 { 00:14:49.647 "name": "BaseBdev3", 00:14:49.647 "aliases": [ 00:14:49.647 "9ab15fe5-4f0e-4772-895b-02c9afb2efde" 00:14:49.647 ], 00:14:49.647 "product_name": "Malloc disk", 00:14:49.647 "block_size": 512, 00:14:49.647 "num_blocks": 65536, 00:14:49.647 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:14:49.647 "assigned_rate_limits": { 00:14:49.647 "rw_ios_per_sec": 0, 00:14:49.647 "rw_mbytes_per_sec": 0, 00:14:49.647 "r_mbytes_per_sec": 0, 00:14:49.647 "w_mbytes_per_sec": 0 00:14:49.647 }, 00:14:49.647 "claimed": false, 00:14:49.647 "zoned": false, 00:14:49.647 "supported_io_types": { 00:14:49.647 "read": true, 00:14:49.647 "write": true, 00:14:49.647 "unmap": true, 00:14:49.647 "flush": true, 00:14:49.647 "reset": true, 00:14:49.647 "nvme_admin": false, 00:14:49.647 "nvme_io": false, 00:14:49.647 "nvme_io_md": false, 00:14:49.647 "write_zeroes": true, 00:14:49.647 "zcopy": true, 00:14:49.647 "get_zone_info": false, 00:14:49.647 "zone_management": false, 00:14:49.647 "zone_append": false, 00:14:49.647 "compare": false, 00:14:49.647 
"compare_and_write": false, 00:14:49.647 "abort": true, 00:14:49.647 "seek_hole": false, 00:14:49.647 "seek_data": false, 00:14:49.647 "copy": true, 00:14:49.647 "nvme_iov_md": false 00:14:49.647 }, 00:14:49.647 "memory_domains": [ 00:14:49.647 { 00:14:49.647 "dma_device_id": "system", 00:14:49.647 "dma_device_type": 1 00:14:49.647 }, 00:14:49.647 { 00:14:49.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.647 "dma_device_type": 2 00:14:49.647 } 00:14:49.647 ], 00:14:49.647 "driver_specific": {} 00:14:49.647 } 00:14:49.647 ] 00:14:49.647 07:20:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:49.647 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:49.647 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:49.647 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:49.906 [2024-07-25 07:20:22.371025] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:49.906 [2024-07-25 07:20:22.371066] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:49.906 [2024-07-25 07:20:22.371085] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:49.906 [2024-07-25 07:20:22.372320] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.906 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.165 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.165 "name": "Existed_Raid", 00:14:50.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.165 "strip_size_kb": 64, 00:14:50.165 "state": "configuring", 00:14:50.165 "raid_level": "raid0", 00:14:50.165 "superblock": false, 00:14:50.165 "num_base_bdevs": 3, 00:14:50.165 "num_base_bdevs_discovered": 2, 00:14:50.165 
"num_base_bdevs_operational": 3, 00:14:50.165 "base_bdevs_list": [ 00:14:50.165 { 00:14:50.165 "name": "BaseBdev1", 00:14:50.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.165 "is_configured": false, 00:14:50.165 "data_offset": 0, 00:14:50.165 "data_size": 0 00:14:50.165 }, 00:14:50.165 { 00:14:50.165 "name": "BaseBdev2", 00:14:50.165 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:50.165 "is_configured": true, 00:14:50.165 "data_offset": 0, 00:14:50.165 "data_size": 65536 00:14:50.165 }, 00:14:50.165 { 00:14:50.165 "name": "BaseBdev3", 00:14:50.165 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:14:50.165 "is_configured": true, 00:14:50.165 "data_offset": 0, 00:14:50.165 "data_size": 65536 00:14:50.165 } 00:14:50.165 ] 00:14:50.165 }' 00:14:50.165 07:20:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.165 07:20:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.732 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:50.990 [2024-07-25 07:20:23.309470] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.990 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.248 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.248 "name": "Existed_Raid", 00:14:51.248 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.248 "strip_size_kb": 64, 00:14:51.248 "state": "configuring", 00:14:51.248 "raid_level": "raid0", 00:14:51.248 "superblock": false, 00:14:51.248 "num_base_bdevs": 3, 00:14:51.248 "num_base_bdevs_discovered": 1, 00:14:51.248 "num_base_bdevs_operational": 3, 00:14:51.248 "base_bdevs_list": [ 00:14:51.248 { 00:14:51.248 "name": "BaseBdev1", 00:14:51.248 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.248 "is_configured": false, 00:14:51.248 "data_offset": 0, 00:14:51.248 "data_size": 0 00:14:51.248 }, 00:14:51.248 { 00:14:51.248 "name": null, 
00:14:51.248 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:51.248 "is_configured": false, 00:14:51.248 "data_offset": 0, 00:14:51.248 "data_size": 65536 00:14:51.248 }, 00:14:51.248 { 00:14:51.248 "name": "BaseBdev3", 00:14:51.248 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:14:51.248 "is_configured": true, 00:14:51.248 "data_offset": 0, 00:14:51.248 "data_size": 65536 00:14:51.248 } 00:14:51.248 ] 00:14:51.248 }' 00:14:51.248 07:20:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.248 07:20:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.816 07:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.816 07:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:51.816 07:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:51.816 07:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:52.384 [2024-07-25 07:20:24.820708] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:52.384 BaseBdev1 00:14:52.384 07:20:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:52.384 07:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:52.384 07:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:52.384 07:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:52.384 07:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:52.384 07:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:52.384 07:20:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:52.952 07:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:53.210 [ 00:14:53.210 { 00:14:53.210 "name": "BaseBdev1", 00:14:53.210 "aliases": [ 00:14:53.210 "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6" 00:14:53.210 ], 00:14:53.210 "product_name": "Malloc disk", 00:14:53.210 "block_size": 512, 00:14:53.210 "num_blocks": 65536, 00:14:53.210 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:14:53.210 "assigned_rate_limits": { 00:14:53.210 "rw_ios_per_sec": 0, 00:14:53.210 "rw_mbytes_per_sec": 0, 00:14:53.210 "r_mbytes_per_sec": 0, 00:14:53.210 "w_mbytes_per_sec": 0 00:14:53.210 }, 00:14:53.210 "claimed": true, 00:14:53.210 "claim_type": "exclusive_write", 00:14:53.210 "zoned": false, 00:14:53.210 "supported_io_types": { 00:14:53.210 "read": true, 00:14:53.210 "write": true, 00:14:53.210 "unmap": true, 00:14:53.210 "flush": true, 00:14:53.210 "reset": true, 00:14:53.210 "nvme_admin": false, 00:14:53.210 "nvme_io": false, 00:14:53.210 "nvme_io_md": false, 00:14:53.210 "write_zeroes": true, 00:14:53.210 "zcopy": true, 00:14:53.210 "get_zone_info": false, 
00:14:53.210 "zone_management": false, 00:14:53.210 "zone_append": false, 00:14:53.210 "compare": false, 00:14:53.210 "compare_and_write": false, 00:14:53.210 "abort": true, 00:14:53.210 "seek_hole": false, 00:14:53.210 "seek_data": false, 00:14:53.210 "copy": true, 00:14:53.210 "nvme_iov_md": false 00:14:53.210 }, 00:14:53.210 "memory_domains": [ 00:14:53.210 { 00:14:53.210 "dma_device_id": "system", 00:14:53.210 "dma_device_type": 1 00:14:53.210 }, 00:14:53.210 { 00:14:53.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.210 "dma_device_type": 2 00:14:53.210 } 00:14:53.210 ], 00:14:53.210 "driver_specific": {} 00:14:53.210 } 00:14:53.210 ] 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.210 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.468 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.468 "name": "Existed_Raid", 00:14:53.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.468 "strip_size_kb": 64, 00:14:53.468 "state": "configuring", 00:14:53.468 "raid_level": "raid0", 00:14:53.468 "superblock": false, 00:14:53.468 "num_base_bdevs": 3, 00:14:53.468 "num_base_bdevs_discovered": 2, 00:14:53.468 "num_base_bdevs_operational": 3, 00:14:53.468 "base_bdevs_list": [ 00:14:53.468 { 00:14:53.468 "name": "BaseBdev1", 00:14:53.468 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:14:53.468 "is_configured": true, 00:14:53.468 "data_offset": 0, 00:14:53.468 "data_size": 65536 00:14:53.468 }, 00:14:53.468 { 00:14:53.468 "name": null, 00:14:53.468 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:53.468 "is_configured": false, 00:14:53.468 "data_offset": 0, 00:14:53.468 "data_size": 65536 00:14:53.469 }, 00:14:53.469 { 00:14:53.469 "name": "BaseBdev3", 00:14:53.469 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:14:53.469 "is_configured": true, 00:14:53.469 "data_offset": 0, 00:14:53.469 "data_size": 65536 00:14:53.469 } 00:14:53.469 ] 00:14:53.469 }' 00:14:53.469 07:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:14:53.469 07:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.036 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.036 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:54.036 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:54.036 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:54.294 [2024-07-25 07:20:26.741793] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.294 07:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.553 07:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.553 "name": "Existed_Raid", 00:14:54.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.553 "strip_size_kb": 64, 00:14:54.553 "state": "configuring", 00:14:54.553 "raid_level": "raid0", 00:14:54.553 "superblock": false, 00:14:54.553 "num_base_bdevs": 3, 00:14:54.553 "num_base_bdevs_discovered": 1, 00:14:54.553 "num_base_bdevs_operational": 3, 00:14:54.553 "base_bdevs_list": [ 00:14:54.553 { 00:14:54.553 "name": "BaseBdev1", 00:14:54.553 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:14:54.553 "is_configured": true, 00:14:54.553 "data_offset": 0, 00:14:54.553 "data_size": 65536 00:14:54.553 }, 00:14:54.553 { 00:14:54.553 "name": null, 00:14:54.553 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:54.553 "is_configured": false, 00:14:54.553 "data_offset": 0, 00:14:54.553 "data_size": 65536 00:14:54.553 }, 00:14:54.553 { 00:14:54.553 "name": null, 00:14:54.553 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:14:54.553 "is_configured": false, 00:14:54.553 "data_offset": 0, 00:14:54.553 "data_size": 65536 00:14:54.553 } 00:14:54.553 ] 00:14:54.553 }' 
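After the bdev_raid_remove_base_bdev BaseBdev3 call above, verify_raid_bdev_state re-reads Existed_Raid and compares its fields; the dump shows state "configuring" with only BaseBdev1 still configured. A bash sketch reduced to the two fields checked here: the RPC command and jq filter are copied from the trace, while the explicit [[ ]] comparisons and the rpc/sock/info variable names are illustrative, not the script's exact text.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# pull the raid bdev entry by name, exactly as bdev_raid.sh@126 does
info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
[[ $(jq -r .state <<< "$info") == configuring ]]            # not online: 2 of 3 base bdevs missing
[[ $(jq -r .num_base_bdevs_discovered <<< "$info") == 1 ]]  # only BaseBdev1 is still configured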
00:14:54.553 07:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.553 07:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.173 07:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.173 07:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:55.431 07:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:55.431 07:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:55.689 [2024-07-25 07:20:28.025207] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.689 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.690 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.690 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.690 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.948 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.948 "name": "Existed_Raid", 00:14:55.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.948 "strip_size_kb": 64, 00:14:55.948 "state": "configuring", 00:14:55.948 "raid_level": "raid0", 00:14:55.948 "superblock": false, 00:14:55.948 "num_base_bdevs": 3, 00:14:55.948 "num_base_bdevs_discovered": 2, 00:14:55.948 "num_base_bdevs_operational": 3, 00:14:55.948 "base_bdevs_list": [ 00:14:55.948 { 00:14:55.948 "name": "BaseBdev1", 00:14:55.948 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:14:55.948 "is_configured": true, 00:14:55.948 "data_offset": 0, 00:14:55.948 "data_size": 65536 00:14:55.948 }, 00:14:55.948 { 00:14:55.948 "name": null, 00:14:55.948 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:55.948 "is_configured": false, 00:14:55.948 "data_offset": 0, 00:14:55.948 "data_size": 65536 00:14:55.948 }, 00:14:55.948 { 00:14:55.948 "name": "BaseBdev3", 00:14:55.948 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 
00:14:55.948 "is_configured": true, 00:14:55.948 "data_offset": 0, 00:14:55.948 "data_size": 65536 00:14:55.948 } 00:14:55.948 ] 00:14:55.948 }' 00:14:55.948 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.948 07:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.514 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.514 07:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:56.772 [2024-07-25 07:20:29.276522] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.772 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.031 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.031 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:57.031 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.031 "name": "Existed_Raid", 00:14:57.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:57.031 "strip_size_kb": 64, 00:14:57.031 "state": "configuring", 00:14:57.031 "raid_level": "raid0", 00:14:57.031 "superblock": false, 00:14:57.031 "num_base_bdevs": 3, 00:14:57.031 "num_base_bdevs_discovered": 1, 00:14:57.031 "num_base_bdevs_operational": 3, 00:14:57.031 "base_bdevs_list": [ 00:14:57.031 { 00:14:57.031 "name": null, 00:14:57.031 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:14:57.031 "is_configured": false, 00:14:57.031 "data_offset": 0, 00:14:57.031 "data_size": 65536 00:14:57.031 }, 00:14:57.031 { 00:14:57.031 "name": null, 00:14:57.031 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:57.031 "is_configured": false, 00:14:57.031 "data_offset": 0, 00:14:57.031 "data_size": 65536 00:14:57.031 }, 00:14:57.031 { 
00:14:57.031 "name": "BaseBdev3", 00:14:57.031 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:14:57.031 "is_configured": true, 00:14:57.031 "data_offset": 0, 00:14:57.031 "data_size": 65536 00:14:57.031 } 00:14:57.031 ] 00:14:57.031 }' 00:14:57.031 07:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.031 07:20:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.599 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:57.599 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.857 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:57.857 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:58.116 [2024-07-25 07:20:30.533976] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.116 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.375 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.375 "name": "Existed_Raid", 00:14:58.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.375 "strip_size_kb": 64, 00:14:58.375 "state": "configuring", 00:14:58.375 "raid_level": "raid0", 00:14:58.375 "superblock": false, 00:14:58.375 "num_base_bdevs": 3, 00:14:58.375 "num_base_bdevs_discovered": 2, 00:14:58.375 "num_base_bdevs_operational": 3, 00:14:58.375 "base_bdevs_list": [ 00:14:58.375 { 00:14:58.375 "name": null, 00:14:58.375 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:14:58.375 "is_configured": false, 00:14:58.375 "data_offset": 0, 00:14:58.375 "data_size": 65536 00:14:58.375 }, 00:14:58.375 { 00:14:58.375 "name": "BaseBdev2", 00:14:58.375 "uuid": 
"40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:14:58.376 "is_configured": true, 00:14:58.376 "data_offset": 0, 00:14:58.376 "data_size": 65536 00:14:58.376 }, 00:14:58.376 { 00:14:58.376 "name": "BaseBdev3", 00:14:58.376 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:14:58.376 "is_configured": true, 00:14:58.376 "data_offset": 0, 00:14:58.376 "data_size": 65536 00:14:58.376 } 00:14:58.376 ] 00:14:58.376 }' 00:14:58.376 07:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.376 07:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.944 07:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.944 07:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:59.202 07:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:59.202 07:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.202 07:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:59.462 07:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b0a88d7f-c45c-47f9-bf34-7970fa58f5a6 00:14:59.721 [2024-07-25 07:20:32.037172] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:59.721 [2024-07-25 07:20:32.037207] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2038680 00:14:59.721 [2024-07-25 07:20:32.037215] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:59.721 [2024-07-25 07:20:32.037393] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20266f0 00:14:59.721 [2024-07-25 07:20:32.037498] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2038680 00:14:59.721 [2024-07-25 07:20:32.037507] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2038680 00:14:59.721 [2024-07-25 07:20:32.037656] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:59.721 NewBaseBdev 00:14:59.721 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:59.721 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:14:59.721 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:59.721 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:59.721 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:59.721 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:59.721 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:59.980 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:59.980 [ 00:14:59.980 { 00:14:59.980 "name": "NewBaseBdev", 00:14:59.980 "aliases": [ 00:14:59.980 "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6" 00:14:59.980 ], 00:14:59.980 "product_name": "Malloc disk", 00:14:59.980 "block_size": 512, 00:14:59.980 "num_blocks": 65536, 00:14:59.980 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:14:59.980 "assigned_rate_limits": { 00:14:59.980 "rw_ios_per_sec": 0, 00:14:59.980 "rw_mbytes_per_sec": 0, 00:14:59.980 "r_mbytes_per_sec": 0, 00:14:59.980 "w_mbytes_per_sec": 0 00:14:59.980 }, 00:14:59.980 "claimed": true, 00:14:59.980 "claim_type": "exclusive_write", 00:14:59.980 "zoned": false, 00:14:59.980 "supported_io_types": { 00:14:59.980 "read": true, 00:14:59.980 "write": true, 00:14:59.980 "unmap": true, 00:14:59.980 "flush": true, 00:14:59.980 "reset": true, 00:14:59.980 "nvme_admin": false, 00:14:59.980 "nvme_io": false, 00:14:59.980 "nvme_io_md": false, 00:14:59.980 "write_zeroes": true, 00:14:59.980 "zcopy": true, 00:14:59.980 "get_zone_info": false, 00:14:59.980 "zone_management": false, 00:14:59.980 "zone_append": false, 00:14:59.980 "compare": false, 00:14:59.980 "compare_and_write": false, 00:14:59.980 "abort": true, 00:14:59.980 "seek_hole": false, 00:14:59.980 "seek_data": false, 00:14:59.980 "copy": true, 00:14:59.980 "nvme_iov_md": false 00:14:59.980 }, 00:14:59.980 "memory_domains": [ 00:14:59.980 { 00:14:59.980 "dma_device_id": "system", 00:14:59.980 "dma_device_type": 1 00:14:59.980 }, 00:14:59.980 { 00:14:59.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.980 "dma_device_type": 2 00:14:59.980 } 00:14:59.980 ], 00:14:59.980 "driver_specific": {} 00:14:59.980 } 00:14:59.980 ] 00:14:59.980 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:59.980 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:59.980 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.980 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:59.980 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:00.239 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.239 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:00.239 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.240 "name": "Existed_Raid", 00:15:00.240 "uuid": 
"75bee40c-f32e-4704-a002-6c24d413835e", 00:15:00.240 "strip_size_kb": 64, 00:15:00.240 "state": "online", 00:15:00.240 "raid_level": "raid0", 00:15:00.240 "superblock": false, 00:15:00.240 "num_base_bdevs": 3, 00:15:00.240 "num_base_bdevs_discovered": 3, 00:15:00.240 "num_base_bdevs_operational": 3, 00:15:00.240 "base_bdevs_list": [ 00:15:00.240 { 00:15:00.240 "name": "NewBaseBdev", 00:15:00.240 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:15:00.240 "is_configured": true, 00:15:00.240 "data_offset": 0, 00:15:00.240 "data_size": 65536 00:15:00.240 }, 00:15:00.240 { 00:15:00.240 "name": "BaseBdev2", 00:15:00.240 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:15:00.240 "is_configured": true, 00:15:00.240 "data_offset": 0, 00:15:00.240 "data_size": 65536 00:15:00.240 }, 00:15:00.240 { 00:15:00.240 "name": "BaseBdev3", 00:15:00.240 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:15:00.240 "is_configured": true, 00:15:00.240 "data_offset": 0, 00:15:00.240 "data_size": 65536 00:15:00.240 } 00:15:00.240 ] 00:15:00.240 }' 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.240 07:20:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:00.808 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:01.066 [2024-07-25 07:20:33.521376] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:01.066 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:01.066 "name": "Existed_Raid", 00:15:01.066 "aliases": [ 00:15:01.066 "75bee40c-f32e-4704-a002-6c24d413835e" 00:15:01.066 ], 00:15:01.066 "product_name": "Raid Volume", 00:15:01.066 "block_size": 512, 00:15:01.066 "num_blocks": 196608, 00:15:01.067 "uuid": "75bee40c-f32e-4704-a002-6c24d413835e", 00:15:01.067 "assigned_rate_limits": { 00:15:01.067 "rw_ios_per_sec": 0, 00:15:01.067 "rw_mbytes_per_sec": 0, 00:15:01.067 "r_mbytes_per_sec": 0, 00:15:01.067 "w_mbytes_per_sec": 0 00:15:01.067 }, 00:15:01.067 "claimed": false, 00:15:01.067 "zoned": false, 00:15:01.067 "supported_io_types": { 00:15:01.067 "read": true, 00:15:01.067 "write": true, 00:15:01.067 "unmap": true, 00:15:01.067 "flush": true, 00:15:01.067 "reset": true, 00:15:01.067 "nvme_admin": false, 00:15:01.067 "nvme_io": false, 00:15:01.067 "nvme_io_md": false, 00:15:01.067 "write_zeroes": true, 00:15:01.067 "zcopy": false, 00:15:01.067 "get_zone_info": false, 00:15:01.067 "zone_management": false, 00:15:01.067 "zone_append": false, 00:15:01.067 "compare": false, 00:15:01.067 "compare_and_write": false, 00:15:01.067 "abort": 
false, 00:15:01.067 "seek_hole": false, 00:15:01.067 "seek_data": false, 00:15:01.067 "copy": false, 00:15:01.067 "nvme_iov_md": false 00:15:01.067 }, 00:15:01.067 "memory_domains": [ 00:15:01.067 { 00:15:01.067 "dma_device_id": "system", 00:15:01.067 "dma_device_type": 1 00:15:01.067 }, 00:15:01.067 { 00:15:01.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.067 "dma_device_type": 2 00:15:01.067 }, 00:15:01.067 { 00:15:01.067 "dma_device_id": "system", 00:15:01.067 "dma_device_type": 1 00:15:01.067 }, 00:15:01.067 { 00:15:01.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.067 "dma_device_type": 2 00:15:01.067 }, 00:15:01.067 { 00:15:01.067 "dma_device_id": "system", 00:15:01.067 "dma_device_type": 1 00:15:01.067 }, 00:15:01.067 { 00:15:01.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.067 "dma_device_type": 2 00:15:01.067 } 00:15:01.067 ], 00:15:01.067 "driver_specific": { 00:15:01.067 "raid": { 00:15:01.067 "uuid": "75bee40c-f32e-4704-a002-6c24d413835e", 00:15:01.067 "strip_size_kb": 64, 00:15:01.067 "state": "online", 00:15:01.067 "raid_level": "raid0", 00:15:01.067 "superblock": false, 00:15:01.067 "num_base_bdevs": 3, 00:15:01.067 "num_base_bdevs_discovered": 3, 00:15:01.067 "num_base_bdevs_operational": 3, 00:15:01.067 "base_bdevs_list": [ 00:15:01.067 { 00:15:01.067 "name": "NewBaseBdev", 00:15:01.067 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:15:01.067 "is_configured": true, 00:15:01.067 "data_offset": 0, 00:15:01.067 "data_size": 65536 00:15:01.067 }, 00:15:01.067 { 00:15:01.067 "name": "BaseBdev2", 00:15:01.067 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:15:01.067 "is_configured": true, 00:15:01.067 "data_offset": 0, 00:15:01.067 "data_size": 65536 00:15:01.067 }, 00:15:01.067 { 00:15:01.067 "name": "BaseBdev3", 00:15:01.067 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:15:01.067 "is_configured": true, 00:15:01.067 "data_offset": 0, 00:15:01.067 "data_size": 65536 00:15:01.067 } 00:15:01.067 ] 00:15:01.067 } 00:15:01.067 } 00:15:01.067 }' 00:15:01.067 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:01.067 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:01.067 BaseBdev2 00:15:01.067 BaseBdev3' 00:15:01.067 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:01.067 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:01.067 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:01.325 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:01.325 "name": "NewBaseBdev", 00:15:01.325 "aliases": [ 00:15:01.325 "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6" 00:15:01.325 ], 00:15:01.325 "product_name": "Malloc disk", 00:15:01.325 "block_size": 512, 00:15:01.325 "num_blocks": 65536, 00:15:01.325 "uuid": "b0a88d7f-c45c-47f9-bf34-7970fa58f5a6", 00:15:01.325 "assigned_rate_limits": { 00:15:01.325 "rw_ios_per_sec": 0, 00:15:01.325 "rw_mbytes_per_sec": 0, 00:15:01.325 "r_mbytes_per_sec": 0, 00:15:01.325 "w_mbytes_per_sec": 0 00:15:01.325 }, 00:15:01.325 "claimed": true, 00:15:01.325 "claim_type": "exclusive_write", 00:15:01.325 "zoned": false, 00:15:01.325 "supported_io_types": { 00:15:01.325 "read": true, 
00:15:01.325 "write": true, 00:15:01.325 "unmap": true, 00:15:01.325 "flush": true, 00:15:01.325 "reset": true, 00:15:01.325 "nvme_admin": false, 00:15:01.325 "nvme_io": false, 00:15:01.325 "nvme_io_md": false, 00:15:01.325 "write_zeroes": true, 00:15:01.325 "zcopy": true, 00:15:01.325 "get_zone_info": false, 00:15:01.325 "zone_management": false, 00:15:01.325 "zone_append": false, 00:15:01.325 "compare": false, 00:15:01.325 "compare_and_write": false, 00:15:01.325 "abort": true, 00:15:01.325 "seek_hole": false, 00:15:01.325 "seek_data": false, 00:15:01.325 "copy": true, 00:15:01.325 "nvme_iov_md": false 00:15:01.325 }, 00:15:01.325 "memory_domains": [ 00:15:01.325 { 00:15:01.325 "dma_device_id": "system", 00:15:01.325 "dma_device_type": 1 00:15:01.325 }, 00:15:01.325 { 00:15:01.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.325 "dma_device_type": 2 00:15:01.325 } 00:15:01.325 ], 00:15:01.325 "driver_specific": {} 00:15:01.325 }' 00:15:01.325 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:01.584 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:01.584 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:01.584 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.584 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.584 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:01.584 07:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.584 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.584 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:01.584 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.584 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.843 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.843 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:01.843 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:01.843 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:01.843 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:01.843 "name": "BaseBdev2", 00:15:01.843 "aliases": [ 00:15:01.843 "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42" 00:15:01.843 ], 00:15:01.843 "product_name": "Malloc disk", 00:15:01.843 "block_size": 512, 00:15:01.843 "num_blocks": 65536, 00:15:01.843 "uuid": "40c1a1f3-bc16-4bdf-8a51-fe00e45e1e42", 00:15:01.843 "assigned_rate_limits": { 00:15:01.843 "rw_ios_per_sec": 0, 00:15:01.843 "rw_mbytes_per_sec": 0, 00:15:01.843 "r_mbytes_per_sec": 0, 00:15:01.843 "w_mbytes_per_sec": 0 00:15:01.843 }, 00:15:01.843 "claimed": true, 00:15:01.843 "claim_type": "exclusive_write", 00:15:01.843 "zoned": false, 00:15:01.843 "supported_io_types": { 00:15:01.843 "read": true, 00:15:01.843 "write": true, 00:15:01.843 "unmap": true, 00:15:01.843 "flush": true, 00:15:01.843 "reset": true, 00:15:01.843 "nvme_admin": false, 00:15:01.843 "nvme_io": false, 
00:15:01.843 "nvme_io_md": false, 00:15:01.843 "write_zeroes": true, 00:15:01.843 "zcopy": true, 00:15:01.843 "get_zone_info": false, 00:15:01.843 "zone_management": false, 00:15:01.843 "zone_append": false, 00:15:01.843 "compare": false, 00:15:01.843 "compare_and_write": false, 00:15:01.843 "abort": true, 00:15:01.843 "seek_hole": false, 00:15:01.843 "seek_data": false, 00:15:01.843 "copy": true, 00:15:01.843 "nvme_iov_md": false 00:15:01.843 }, 00:15:01.843 "memory_domains": [ 00:15:01.843 { 00:15:01.843 "dma_device_id": "system", 00:15:01.843 "dma_device_type": 1 00:15:01.843 }, 00:15:01.843 { 00:15:01.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.843 "dma_device_type": 2 00:15:01.843 } 00:15:01.843 ], 00:15:01.843 "driver_specific": {} 00:15:01.843 }' 00:15:01.843 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.102 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.361 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.361 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:02.361 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:02.361 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:02.361 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:02.621 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.621 "name": "BaseBdev3", 00:15:02.621 "aliases": [ 00:15:02.621 "9ab15fe5-4f0e-4772-895b-02c9afb2efde" 00:15:02.621 ], 00:15:02.621 "product_name": "Malloc disk", 00:15:02.621 "block_size": 512, 00:15:02.621 "num_blocks": 65536, 00:15:02.621 "uuid": "9ab15fe5-4f0e-4772-895b-02c9afb2efde", 00:15:02.621 "assigned_rate_limits": { 00:15:02.621 "rw_ios_per_sec": 0, 00:15:02.621 "rw_mbytes_per_sec": 0, 00:15:02.621 "r_mbytes_per_sec": 0, 00:15:02.621 "w_mbytes_per_sec": 0 00:15:02.621 }, 00:15:02.621 "claimed": true, 00:15:02.621 "claim_type": "exclusive_write", 00:15:02.621 "zoned": false, 00:15:02.621 "supported_io_types": { 00:15:02.621 "read": true, 00:15:02.621 "write": true, 00:15:02.621 "unmap": true, 00:15:02.621 "flush": true, 00:15:02.621 "reset": true, 00:15:02.621 "nvme_admin": false, 00:15:02.621 "nvme_io": false, 00:15:02.621 "nvme_io_md": false, 00:15:02.621 "write_zeroes": true, 00:15:02.621 "zcopy": true, 00:15:02.621 "get_zone_info": false, 00:15:02.621 "zone_management": false, 
00:15:02.621 "zone_append": false, 00:15:02.621 "compare": false, 00:15:02.621 "compare_and_write": false, 00:15:02.621 "abort": true, 00:15:02.621 "seek_hole": false, 00:15:02.621 "seek_data": false, 00:15:02.621 "copy": true, 00:15:02.621 "nvme_iov_md": false 00:15:02.621 }, 00:15:02.621 "memory_domains": [ 00:15:02.621 { 00:15:02.621 "dma_device_id": "system", 00:15:02.621 "dma_device_type": 1 00:15:02.621 }, 00:15:02.621 { 00:15:02.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.621 "dma_device_type": 2 00:15:02.621 } 00:15:02.621 ], 00:15:02.621 "driver_specific": {} 00:15:02.621 }' 00:15:02.621 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.621 07:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.621 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.621 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.621 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.621 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.621 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.621 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.881 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.881 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.881 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.881 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:02.881 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:03.140 [2024-07-25 07:20:35.454234] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:03.140 [2024-07-25 07:20:35.454260] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:03.140 [2024-07-25 07:20:35.454314] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:03.140 [2024-07-25 07:20:35.454367] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:03.140 [2024-07-25 07:20:35.454379] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2038680 name Existed_Raid, state offline 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1611883 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1611883 ']' 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1611883 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1611883 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:03.140 07:20:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1611883' 00:15:03.140 killing process with pid 1611883 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1611883 00:15:03.140 [2024-07-25 07:20:35.529123] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:03.140 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1611883 00:15:03.140 [2024-07-25 07:20:35.551814] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:03.400 00:15:03.400 real 0m27.508s 00:15:03.400 user 0m50.547s 00:15:03.400 sys 0m4.827s 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:03.400 ************************************ 00:15:03.400 END TEST raid_state_function_test 00:15:03.400 ************************************ 00:15:03.400 07:20:35 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:15:03.400 07:20:35 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:03.400 07:20:35 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:03.400 07:20:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:03.400 ************************************ 00:15:03.400 START TEST raid_state_function_test_sb 00:15:03.400 ************************************ 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 true 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1617240 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1617240' 00:15:03.400 Process raid pid: 1617240 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1617240 /var/tmp/spdk-raid.sock 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1617240 ']' 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:03.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:03.400 07:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:03.400 [2024-07-25 07:20:35.901687] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:15:03.400 [2024-07-25 07:20:35.901747] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:03.660 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:03.660 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.660 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:03.660 [2024-07-25 07:20:36.026473] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:03.660 [2024-07-25 07:20:36.111865] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:15:03.660 [2024-07-25 07:20:36.170516] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:03.660 [2024-07-25 07:20:36.170551] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:04.597 07:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:04.597 07:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:15:04.597 07:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:04.597 [2024-07-25 07:20:37.001299] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:04.597 [2024-07-25 07:20:37.001335] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:04.597 [2024-07-25 07:20:37.001344] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:04.597 [2024-07-25 07:20:37.001355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:04.597 [2024-07-25 07:20:37.001363] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:04.597 [2024-07-25 07:20:37.001372] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.597 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.856 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.856 "name": "Existed_Raid", 00:15:04.856 "uuid": "df31ee3e-f966-4247-9d72-8c1dda09b1e5", 00:15:04.856 "strip_size_kb": 64, 00:15:04.856 "state": "configuring", 00:15:04.856 "raid_level": "raid0", 00:15:04.856 "superblock": true, 00:15:04.856 "num_base_bdevs": 3, 00:15:04.856 "num_base_bdevs_discovered": 0, 00:15:04.856 "num_base_bdevs_operational": 3, 00:15:04.856 "base_bdevs_list": [ 00:15:04.856 { 00:15:04.856 "name": "BaseBdev1", 00:15:04.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.856 "is_configured": false, 00:15:04.856 "data_offset": 0, 00:15:04.856 "data_size": 0 00:15:04.856 }, 00:15:04.856 { 00:15:04.856 "name": "BaseBdev2", 00:15:04.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.856 "is_configured": false, 00:15:04.856 "data_offset": 0, 00:15:04.856 "data_size": 0 00:15:04.856 }, 00:15:04.856 { 00:15:04.856 "name": "BaseBdev3", 00:15:04.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.856 "is_configured": false, 00:15:04.856 "data_offset": 0, 00:15:04.856 "data_size": 0 00:15:04.856 } 00:15:04.856 ] 00:15:04.856 }' 00:15:04.856 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.856 07:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:05.423 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:05.682 [2024-07-25 07:20:37.979735] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:05.682 [2024-07-25 07:20:37.979760] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15c7ec0 name Existed_Raid, state configuring 00:15:05.682 07:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:05.682 [2024-07-25 07:20:38.136174] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:05.682 [2024-07-25 07:20:38.136199] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:05.682 [2024-07-25 07:20:38.136208] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:05.682 [2024-07-25 07:20:38.136218] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:05.682 [2024-07-25 07:20:38.136226] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:05.682 [2024-07-25 07:20:38.136236] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:05.682 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:05.941 [2024-07-25 07:20:38.314088] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:05.941 BaseBdev1 00:15:05.941 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:05.941 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:05.941 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:05.941 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:05.941 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:05.941 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:05.941 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:06.200 [ 00:15:06.200 { 00:15:06.200 "name": "BaseBdev1", 00:15:06.200 "aliases": [ 00:15:06.200 "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3" 00:15:06.200 ], 00:15:06.200 "product_name": "Malloc disk", 00:15:06.200 "block_size": 512, 00:15:06.200 "num_blocks": 65536, 00:15:06.200 "uuid": "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3", 00:15:06.200 "assigned_rate_limits": { 00:15:06.200 "rw_ios_per_sec": 0, 00:15:06.200 "rw_mbytes_per_sec": 0, 00:15:06.200 "r_mbytes_per_sec": 0, 00:15:06.200 "w_mbytes_per_sec": 0 00:15:06.200 }, 00:15:06.200 "claimed": true, 00:15:06.200 "claim_type": "exclusive_write", 00:15:06.200 "zoned": false, 00:15:06.200 "supported_io_types": { 00:15:06.200 "read": true, 00:15:06.200 "write": true, 00:15:06.200 "unmap": true, 00:15:06.200 "flush": true, 00:15:06.200 "reset": true, 00:15:06.200 "nvme_admin": false, 00:15:06.200 "nvme_io": false, 00:15:06.200 "nvme_io_md": false, 00:15:06.200 "write_zeroes": true, 00:15:06.200 "zcopy": true, 00:15:06.200 "get_zone_info": false, 00:15:06.200 "zone_management": false, 00:15:06.200 "zone_append": false, 00:15:06.200 "compare": false, 00:15:06.200 "compare_and_write": false, 00:15:06.200 "abort": true, 00:15:06.200 "seek_hole": false, 00:15:06.200 "seek_data": false, 00:15:06.200 "copy": true, 00:15:06.200 "nvme_iov_md": false 00:15:06.200 }, 00:15:06.200 "memory_domains": [ 00:15:06.200 { 00:15:06.200 "dma_device_id": "system", 00:15:06.200 "dma_device_type": 1 00:15:06.200 }, 00:15:06.200 { 00:15:06.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.200 "dma_device_type": 2 00:15:06.200 } 00:15:06.200 ], 00:15:06.200 "driver_specific": {} 00:15:06.200 } 00:15:06.200 ] 00:15:06.200 07:20:38 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.200 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.459 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.459 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.459 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.459 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.459 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.459 "name": "Existed_Raid", 00:15:06.459 "uuid": "5bb5655b-22f9-4001-90ae-af84c7fc3719", 00:15:06.459 "strip_size_kb": 64, 00:15:06.459 "state": "configuring", 00:15:06.459 "raid_level": "raid0", 00:15:06.459 "superblock": true, 00:15:06.459 "num_base_bdevs": 3, 00:15:06.459 "num_base_bdevs_discovered": 1, 00:15:06.459 "num_base_bdevs_operational": 3, 00:15:06.459 "base_bdevs_list": [ 00:15:06.459 { 00:15:06.459 "name": "BaseBdev1", 00:15:06.459 "uuid": "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3", 00:15:06.459 "is_configured": true, 00:15:06.459 "data_offset": 2048, 00:15:06.459 "data_size": 63488 00:15:06.459 }, 00:15:06.459 { 00:15:06.459 "name": "BaseBdev2", 00:15:06.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.459 "is_configured": false, 00:15:06.459 "data_offset": 0, 00:15:06.459 "data_size": 0 00:15:06.459 }, 00:15:06.459 { 00:15:06.459 "name": "BaseBdev3", 00:15:06.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.459 "is_configured": false, 00:15:06.459 "data_offset": 0, 00:15:06.459 "data_size": 0 00:15:06.459 } 00:15:06.459 ] 00:15:06.459 }' 00:15:06.459 07:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.459 07:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.027 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:07.285 [2024-07-25 07:20:39.613491] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:07.285 [2024-07-25 07:20:39.613524] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15c7790 name Existed_Raid, state configuring 00:15:07.285 07:20:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:07.285 [2024-07-25 07:20:39.781981] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:07.285 [2024-07-25 07:20:39.783342] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:07.285 [2024-07-25 07:20:39.783371] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:07.285 [2024-07-25 07:20:39.783380] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:07.285 [2024-07-25 07:20:39.783391] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.285 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.544 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.545 "name": "Existed_Raid", 00:15:07.545 "uuid": "d12b5182-5df2-491a-a2c4-b5512eabaa60", 00:15:07.545 "strip_size_kb": 64, 00:15:07.545 "state": "configuring", 00:15:07.545 "raid_level": "raid0", 00:15:07.545 "superblock": true, 00:15:07.545 "num_base_bdevs": 3, 00:15:07.545 "num_base_bdevs_discovered": 1, 00:15:07.545 "num_base_bdevs_operational": 3, 00:15:07.545 "base_bdevs_list": [ 00:15:07.545 { 00:15:07.545 "name": "BaseBdev1", 00:15:07.545 "uuid": "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3", 00:15:07.545 "is_configured": true, 00:15:07.545 "data_offset": 2048, 00:15:07.545 "data_size": 63488 00:15:07.545 }, 00:15:07.545 { 00:15:07.545 "name": "BaseBdev2", 00:15:07.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.545 "is_configured": false, 00:15:07.545 "data_offset": 0, 
00:15:07.545 "data_size": 0 00:15:07.545 }, 00:15:07.545 { 00:15:07.545 "name": "BaseBdev3", 00:15:07.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.545 "is_configured": false, 00:15:07.545 "data_offset": 0, 00:15:07.545 "data_size": 0 00:15:07.545 } 00:15:07.545 ] 00:15:07.545 }' 00:15:07.545 07:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.545 07:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.169 07:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:08.435 [2024-07-25 07:20:40.719486] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:08.435 BaseBdev2 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:08.435 07:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:08.695 [ 00:15:08.695 { 00:15:08.695 "name": "BaseBdev2", 00:15:08.695 "aliases": [ 00:15:08.695 "23eab470-950d-4b31-82cf-82fdfbde3613" 00:15:08.695 ], 00:15:08.695 "product_name": "Malloc disk", 00:15:08.695 "block_size": 512, 00:15:08.695 "num_blocks": 65536, 00:15:08.695 "uuid": "23eab470-950d-4b31-82cf-82fdfbde3613", 00:15:08.695 "assigned_rate_limits": { 00:15:08.695 "rw_ios_per_sec": 0, 00:15:08.695 "rw_mbytes_per_sec": 0, 00:15:08.695 "r_mbytes_per_sec": 0, 00:15:08.695 "w_mbytes_per_sec": 0 00:15:08.695 }, 00:15:08.695 "claimed": true, 00:15:08.695 "claim_type": "exclusive_write", 00:15:08.695 "zoned": false, 00:15:08.695 "supported_io_types": { 00:15:08.695 "read": true, 00:15:08.695 "write": true, 00:15:08.695 "unmap": true, 00:15:08.695 "flush": true, 00:15:08.695 "reset": true, 00:15:08.695 "nvme_admin": false, 00:15:08.695 "nvme_io": false, 00:15:08.695 "nvme_io_md": false, 00:15:08.695 "write_zeroes": true, 00:15:08.695 "zcopy": true, 00:15:08.695 "get_zone_info": false, 00:15:08.695 "zone_management": false, 00:15:08.695 "zone_append": false, 00:15:08.695 "compare": false, 00:15:08.695 "compare_and_write": false, 00:15:08.695 "abort": true, 00:15:08.695 "seek_hole": false, 00:15:08.695 "seek_data": false, 00:15:08.695 "copy": true, 00:15:08.695 "nvme_iov_md": false 00:15:08.695 }, 00:15:08.695 "memory_domains": [ 00:15:08.695 { 00:15:08.695 "dma_device_id": "system", 00:15:08.695 "dma_device_type": 1 00:15:08.695 }, 00:15:08.695 { 00:15:08.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.695 "dma_device_type": 
2 00:15:08.695 } 00:15:08.695 ], 00:15:08.695 "driver_specific": {} 00:15:08.695 } 00:15:08.695 ] 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.695 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:08.954 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.954 "name": "Existed_Raid", 00:15:08.954 "uuid": "d12b5182-5df2-491a-a2c4-b5512eabaa60", 00:15:08.954 "strip_size_kb": 64, 00:15:08.954 "state": "configuring", 00:15:08.954 "raid_level": "raid0", 00:15:08.954 "superblock": true, 00:15:08.954 "num_base_bdevs": 3, 00:15:08.954 "num_base_bdevs_discovered": 2, 00:15:08.954 "num_base_bdevs_operational": 3, 00:15:08.954 "base_bdevs_list": [ 00:15:08.954 { 00:15:08.954 "name": "BaseBdev1", 00:15:08.954 "uuid": "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3", 00:15:08.954 "is_configured": true, 00:15:08.954 "data_offset": 2048, 00:15:08.954 "data_size": 63488 00:15:08.954 }, 00:15:08.954 { 00:15:08.954 "name": "BaseBdev2", 00:15:08.954 "uuid": "23eab470-950d-4b31-82cf-82fdfbde3613", 00:15:08.955 "is_configured": true, 00:15:08.955 "data_offset": 2048, 00:15:08.955 "data_size": 63488 00:15:08.955 }, 00:15:08.955 { 00:15:08.955 "name": "BaseBdev3", 00:15:08.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.955 "is_configured": false, 00:15:08.955 "data_offset": 0, 00:15:08.955 "data_size": 0 00:15:08.955 } 00:15:08.955 ] 00:15:08.955 }' 00:15:08.955 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.955 07:20:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:09.522 07:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev3 00:15:09.781 [2024-07-25 07:20:42.102378] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:09.781 [2024-07-25 07:20:42.102527] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15c8680 00:15:09.781 [2024-07-25 07:20:42.102540] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:09.781 [2024-07-25 07:20:42.102699] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15c8350 00:15:09.781 [2024-07-25 07:20:42.102817] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15c8680 00:15:09.781 [2024-07-25 07:20:42.102826] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15c8680 00:15:09.781 [2024-07-25 07:20:42.102908] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:09.781 BaseBdev3 00:15:09.781 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:09.781 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:09.781 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:09.781 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:09.781 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:09.781 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:09.781 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:10.040 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:10.040 [ 00:15:10.040 { 00:15:10.040 "name": "BaseBdev3", 00:15:10.040 "aliases": [ 00:15:10.040 "9539a93e-41db-4e71-a6e7-1e2b0c6ad580" 00:15:10.040 ], 00:15:10.040 "product_name": "Malloc disk", 00:15:10.040 "block_size": 512, 00:15:10.040 "num_blocks": 65536, 00:15:10.040 "uuid": "9539a93e-41db-4e71-a6e7-1e2b0c6ad580", 00:15:10.040 "assigned_rate_limits": { 00:15:10.040 "rw_ios_per_sec": 0, 00:15:10.040 "rw_mbytes_per_sec": 0, 00:15:10.040 "r_mbytes_per_sec": 0, 00:15:10.040 "w_mbytes_per_sec": 0 00:15:10.040 }, 00:15:10.040 "claimed": true, 00:15:10.040 "claim_type": "exclusive_write", 00:15:10.040 "zoned": false, 00:15:10.040 "supported_io_types": { 00:15:10.040 "read": true, 00:15:10.040 "write": true, 00:15:10.040 "unmap": true, 00:15:10.040 "flush": true, 00:15:10.040 "reset": true, 00:15:10.040 "nvme_admin": false, 00:15:10.040 "nvme_io": false, 00:15:10.040 "nvme_io_md": false, 00:15:10.040 "write_zeroes": true, 00:15:10.040 "zcopy": true, 00:15:10.040 "get_zone_info": false, 00:15:10.040 "zone_management": false, 00:15:10.040 "zone_append": false, 00:15:10.040 "compare": false, 00:15:10.040 "compare_and_write": false, 00:15:10.040 "abort": true, 00:15:10.040 "seek_hole": false, 00:15:10.040 "seek_data": false, 00:15:10.040 "copy": true, 00:15:10.040 "nvme_iov_md": false 00:15:10.040 }, 00:15:10.040 "memory_domains": [ 00:15:10.040 { 00:15:10.040 "dma_device_id": "system", 00:15:10.041 "dma_device_type": 1 00:15:10.041 }, 00:15:10.041 { 00:15:10.041 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.041 "dma_device_type": 2 00:15:10.041 } 00:15:10.041 ], 00:15:10.041 "driver_specific": {} 00:15:10.041 } 00:15:10.041 ] 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.300 "name": "Existed_Raid", 00:15:10.300 "uuid": "d12b5182-5df2-491a-a2c4-b5512eabaa60", 00:15:10.300 "strip_size_kb": 64, 00:15:10.300 "state": "online", 00:15:10.300 "raid_level": "raid0", 00:15:10.300 "superblock": true, 00:15:10.300 "num_base_bdevs": 3, 00:15:10.300 "num_base_bdevs_discovered": 3, 00:15:10.300 "num_base_bdevs_operational": 3, 00:15:10.300 "base_bdevs_list": [ 00:15:10.300 { 00:15:10.300 "name": "BaseBdev1", 00:15:10.300 "uuid": "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3", 00:15:10.300 "is_configured": true, 00:15:10.300 "data_offset": 2048, 00:15:10.300 "data_size": 63488 00:15:10.300 }, 00:15:10.300 { 00:15:10.300 "name": "BaseBdev2", 00:15:10.300 "uuid": "23eab470-950d-4b31-82cf-82fdfbde3613", 00:15:10.300 "is_configured": true, 00:15:10.300 "data_offset": 2048, 00:15:10.300 "data_size": 63488 00:15:10.300 }, 00:15:10.300 { 00:15:10.300 "name": "BaseBdev3", 00:15:10.300 "uuid": "9539a93e-41db-4e71-a6e7-1e2b0c6ad580", 00:15:10.300 "is_configured": true, 00:15:10.300 "data_offset": 2048, 00:15:10.300 "data_size": 63488 00:15:10.300 } 00:15:10.300 ] 00:15:10.300 }' 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.300 07:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:10.868 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:11.126 [2024-07-25 07:20:43.590575] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:11.126 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:11.126 "name": "Existed_Raid", 00:15:11.126 "aliases": [ 00:15:11.126 "d12b5182-5df2-491a-a2c4-b5512eabaa60" 00:15:11.126 ], 00:15:11.126 "product_name": "Raid Volume", 00:15:11.126 "block_size": 512, 00:15:11.126 "num_blocks": 190464, 00:15:11.126 "uuid": "d12b5182-5df2-491a-a2c4-b5512eabaa60", 00:15:11.126 "assigned_rate_limits": { 00:15:11.126 "rw_ios_per_sec": 0, 00:15:11.126 "rw_mbytes_per_sec": 0, 00:15:11.126 "r_mbytes_per_sec": 0, 00:15:11.126 "w_mbytes_per_sec": 0 00:15:11.126 }, 00:15:11.126 "claimed": false, 00:15:11.126 "zoned": false, 00:15:11.126 "supported_io_types": { 00:15:11.126 "read": true, 00:15:11.126 "write": true, 00:15:11.126 "unmap": true, 00:15:11.126 "flush": true, 00:15:11.126 "reset": true, 00:15:11.126 "nvme_admin": false, 00:15:11.126 "nvme_io": false, 00:15:11.126 "nvme_io_md": false, 00:15:11.126 "write_zeroes": true, 00:15:11.126 "zcopy": false, 00:15:11.126 "get_zone_info": false, 00:15:11.126 "zone_management": false, 00:15:11.126 "zone_append": false, 00:15:11.126 "compare": false, 00:15:11.126 "compare_and_write": false, 00:15:11.126 "abort": false, 00:15:11.126 "seek_hole": false, 00:15:11.126 "seek_data": false, 00:15:11.126 "copy": false, 00:15:11.126 "nvme_iov_md": false 00:15:11.126 }, 00:15:11.126 "memory_domains": [ 00:15:11.126 { 00:15:11.126 "dma_device_id": "system", 00:15:11.126 "dma_device_type": 1 00:15:11.127 }, 00:15:11.127 { 00:15:11.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.127 "dma_device_type": 2 00:15:11.127 }, 00:15:11.127 { 00:15:11.127 "dma_device_id": "system", 00:15:11.127 "dma_device_type": 1 00:15:11.127 }, 00:15:11.127 { 00:15:11.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.127 "dma_device_type": 2 00:15:11.127 }, 00:15:11.127 { 00:15:11.127 "dma_device_id": "system", 00:15:11.127 "dma_device_type": 1 00:15:11.127 }, 00:15:11.127 { 00:15:11.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.127 "dma_device_type": 2 00:15:11.127 } 00:15:11.127 ], 00:15:11.127 "driver_specific": { 00:15:11.127 "raid": { 00:15:11.127 "uuid": "d12b5182-5df2-491a-a2c4-b5512eabaa60", 00:15:11.127 "strip_size_kb": 64, 00:15:11.127 "state": "online", 00:15:11.127 "raid_level": "raid0", 00:15:11.127 "superblock": true, 00:15:11.127 "num_base_bdevs": 3, 00:15:11.127 "num_base_bdevs_discovered": 3, 00:15:11.127 "num_base_bdevs_operational": 3, 00:15:11.127 "base_bdevs_list": [ 00:15:11.127 { 00:15:11.127 "name": "BaseBdev1", 
00:15:11.127 "uuid": "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3", 00:15:11.127 "is_configured": true, 00:15:11.127 "data_offset": 2048, 00:15:11.127 "data_size": 63488 00:15:11.127 }, 00:15:11.127 { 00:15:11.127 "name": "BaseBdev2", 00:15:11.127 "uuid": "23eab470-950d-4b31-82cf-82fdfbde3613", 00:15:11.127 "is_configured": true, 00:15:11.127 "data_offset": 2048, 00:15:11.127 "data_size": 63488 00:15:11.127 }, 00:15:11.127 { 00:15:11.127 "name": "BaseBdev3", 00:15:11.127 "uuid": "9539a93e-41db-4e71-a6e7-1e2b0c6ad580", 00:15:11.127 "is_configured": true, 00:15:11.127 "data_offset": 2048, 00:15:11.127 "data_size": 63488 00:15:11.127 } 00:15:11.127 ] 00:15:11.127 } 00:15:11.127 } 00:15:11.127 }' 00:15:11.127 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:11.127 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:11.127 BaseBdev2 00:15:11.127 BaseBdev3' 00:15:11.127 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.127 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:11.127 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:11.386 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:11.386 "name": "BaseBdev1", 00:15:11.386 "aliases": [ 00:15:11.386 "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3" 00:15:11.386 ], 00:15:11.386 "product_name": "Malloc disk", 00:15:11.386 "block_size": 512, 00:15:11.386 "num_blocks": 65536, 00:15:11.386 "uuid": "58577d4f-8254-4eaf-82b6-7eb3b53e9dd3", 00:15:11.386 "assigned_rate_limits": { 00:15:11.386 "rw_ios_per_sec": 0, 00:15:11.386 "rw_mbytes_per_sec": 0, 00:15:11.386 "r_mbytes_per_sec": 0, 00:15:11.386 "w_mbytes_per_sec": 0 00:15:11.386 }, 00:15:11.386 "claimed": true, 00:15:11.386 "claim_type": "exclusive_write", 00:15:11.386 "zoned": false, 00:15:11.386 "supported_io_types": { 00:15:11.386 "read": true, 00:15:11.386 "write": true, 00:15:11.386 "unmap": true, 00:15:11.386 "flush": true, 00:15:11.386 "reset": true, 00:15:11.386 "nvme_admin": false, 00:15:11.386 "nvme_io": false, 00:15:11.386 "nvme_io_md": false, 00:15:11.386 "write_zeroes": true, 00:15:11.386 "zcopy": true, 00:15:11.386 "get_zone_info": false, 00:15:11.386 "zone_management": false, 00:15:11.386 "zone_append": false, 00:15:11.386 "compare": false, 00:15:11.386 "compare_and_write": false, 00:15:11.386 "abort": true, 00:15:11.386 "seek_hole": false, 00:15:11.386 "seek_data": false, 00:15:11.386 "copy": true, 00:15:11.386 "nvme_iov_md": false 00:15:11.386 }, 00:15:11.386 "memory_domains": [ 00:15:11.386 { 00:15:11.386 "dma_device_id": "system", 00:15:11.386 "dma_device_type": 1 00:15:11.386 }, 00:15:11.386 { 00:15:11.386 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.386 "dma_device_type": 2 00:15:11.386 } 00:15:11.386 ], 00:15:11.386 "driver_specific": {} 00:15:11.386 }' 00:15:11.386 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.645 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.645 07:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:11.645 07:20:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.645 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.645 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:11.645 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.645 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.645 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:11.645 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.645 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.904 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:11.904 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.904 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:11.904 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:12.162 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.162 "name": "BaseBdev2", 00:15:12.162 "aliases": [ 00:15:12.162 "23eab470-950d-4b31-82cf-82fdfbde3613" 00:15:12.162 ], 00:15:12.162 "product_name": "Malloc disk", 00:15:12.162 "block_size": 512, 00:15:12.162 "num_blocks": 65536, 00:15:12.162 "uuid": "23eab470-950d-4b31-82cf-82fdfbde3613", 00:15:12.162 "assigned_rate_limits": { 00:15:12.162 "rw_ios_per_sec": 0, 00:15:12.162 "rw_mbytes_per_sec": 0, 00:15:12.163 "r_mbytes_per_sec": 0, 00:15:12.163 "w_mbytes_per_sec": 0 00:15:12.163 }, 00:15:12.163 "claimed": true, 00:15:12.163 "claim_type": "exclusive_write", 00:15:12.163 "zoned": false, 00:15:12.163 "supported_io_types": { 00:15:12.163 "read": true, 00:15:12.163 "write": true, 00:15:12.163 "unmap": true, 00:15:12.163 "flush": true, 00:15:12.163 "reset": true, 00:15:12.163 "nvme_admin": false, 00:15:12.163 "nvme_io": false, 00:15:12.163 "nvme_io_md": false, 00:15:12.163 "write_zeroes": true, 00:15:12.163 "zcopy": true, 00:15:12.163 "get_zone_info": false, 00:15:12.163 "zone_management": false, 00:15:12.163 "zone_append": false, 00:15:12.163 "compare": false, 00:15:12.163 "compare_and_write": false, 00:15:12.163 "abort": true, 00:15:12.163 "seek_hole": false, 00:15:12.163 "seek_data": false, 00:15:12.163 "copy": true, 00:15:12.163 "nvme_iov_md": false 00:15:12.163 }, 00:15:12.163 "memory_domains": [ 00:15:12.163 { 00:15:12.163 "dma_device_id": "system", 00:15:12.163 "dma_device_type": 1 00:15:12.163 }, 00:15:12.163 { 00:15:12.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.163 "dma_device_type": 2 00:15:12.163 } 00:15:12.163 ], 00:15:12.163 "driver_specific": {} 00:15:12.163 }' 00:15:12.163 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.163 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.163 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.163 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.163 07:20:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.163 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.163 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.163 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.421 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.421 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.421 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.421 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.421 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:12.422 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:12.422 07:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:12.681 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.681 "name": "BaseBdev3", 00:15:12.681 "aliases": [ 00:15:12.681 "9539a93e-41db-4e71-a6e7-1e2b0c6ad580" 00:15:12.681 ], 00:15:12.681 "product_name": "Malloc disk", 00:15:12.681 "block_size": 512, 00:15:12.681 "num_blocks": 65536, 00:15:12.681 "uuid": "9539a93e-41db-4e71-a6e7-1e2b0c6ad580", 00:15:12.681 "assigned_rate_limits": { 00:15:12.681 "rw_ios_per_sec": 0, 00:15:12.681 "rw_mbytes_per_sec": 0, 00:15:12.681 "r_mbytes_per_sec": 0, 00:15:12.681 "w_mbytes_per_sec": 0 00:15:12.681 }, 00:15:12.681 "claimed": true, 00:15:12.681 "claim_type": "exclusive_write", 00:15:12.681 "zoned": false, 00:15:12.681 "supported_io_types": { 00:15:12.681 "read": true, 00:15:12.681 "write": true, 00:15:12.681 "unmap": true, 00:15:12.681 "flush": true, 00:15:12.681 "reset": true, 00:15:12.681 "nvme_admin": false, 00:15:12.681 "nvme_io": false, 00:15:12.681 "nvme_io_md": false, 00:15:12.681 "write_zeroes": true, 00:15:12.681 "zcopy": true, 00:15:12.681 "get_zone_info": false, 00:15:12.681 "zone_management": false, 00:15:12.681 "zone_append": false, 00:15:12.681 "compare": false, 00:15:12.681 "compare_and_write": false, 00:15:12.681 "abort": true, 00:15:12.681 "seek_hole": false, 00:15:12.681 "seek_data": false, 00:15:12.681 "copy": true, 00:15:12.681 "nvme_iov_md": false 00:15:12.681 }, 00:15:12.681 "memory_domains": [ 00:15:12.681 { 00:15:12.681 "dma_device_id": "system", 00:15:12.681 "dma_device_type": 1 00:15:12.681 }, 00:15:12.681 { 00:15:12.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.681 "dma_device_type": 2 00:15:12.681 } 00:15:12.681 ], 00:15:12.681 "driver_specific": {} 00:15:12.681 }' 00:15:12.681 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.681 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.681 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.681 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.681 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.681 07:20:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.681 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.939 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.939 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.939 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.939 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.939 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.939 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:13.199 [2024-07-25 07:20:45.531439] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:13.199 [2024-07-25 07:20:45.531466] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:13.199 [2024-07-25 07:20:45.531502] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.199 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.458 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.458 "name": "Existed_Raid", 00:15:13.458 "uuid": "d12b5182-5df2-491a-a2c4-b5512eabaa60", 
00:15:13.458 "strip_size_kb": 64, 00:15:13.458 "state": "offline", 00:15:13.458 "raid_level": "raid0", 00:15:13.458 "superblock": true, 00:15:13.458 "num_base_bdevs": 3, 00:15:13.458 "num_base_bdevs_discovered": 2, 00:15:13.458 "num_base_bdevs_operational": 2, 00:15:13.458 "base_bdevs_list": [ 00:15:13.458 { 00:15:13.458 "name": null, 00:15:13.458 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.458 "is_configured": false, 00:15:13.458 "data_offset": 2048, 00:15:13.458 "data_size": 63488 00:15:13.458 }, 00:15:13.458 { 00:15:13.458 "name": "BaseBdev2", 00:15:13.458 "uuid": "23eab470-950d-4b31-82cf-82fdfbde3613", 00:15:13.458 "is_configured": true, 00:15:13.458 "data_offset": 2048, 00:15:13.458 "data_size": 63488 00:15:13.458 }, 00:15:13.458 { 00:15:13.458 "name": "BaseBdev3", 00:15:13.458 "uuid": "9539a93e-41db-4e71-a6e7-1e2b0c6ad580", 00:15:13.458 "is_configured": true, 00:15:13.458 "data_offset": 2048, 00:15:13.458 "data_size": 63488 00:15:13.458 } 00:15:13.458 ] 00:15:13.458 }' 00:15:13.458 07:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.458 07:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:14.026 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:14.026 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:14.026 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.026 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:14.026 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:14.026 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:14.026 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:14.286 [2024-07-25 07:20:46.747583] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:14.286 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:14.286 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:14.286 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.286 07:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:14.545 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:14.545 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:14.545 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:14.804 [2024-07-25 07:20:47.210778] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:14.804 [2024-07-25 07:20:47.210814] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15c8680 name Existed_Raid, state offline 00:15:14.804 
07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:14.804 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:14.804 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.804 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:15.062 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:15.062 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:15.063 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:15.063 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:15.063 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:15.063 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:15.322 BaseBdev2 00:15:15.322 07:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:15.322 07:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:15.322 07:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:15.322 07:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:15.322 07:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:15.322 07:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:15.322 07:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.580 07:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:15.839 [ 00:15:15.839 { 00:15:15.839 "name": "BaseBdev2", 00:15:15.839 "aliases": [ 00:15:15.839 "27e59f96-1788-4612-83d6-20e8869db0aa" 00:15:15.839 ], 00:15:15.839 "product_name": "Malloc disk", 00:15:15.839 "block_size": 512, 00:15:15.839 "num_blocks": 65536, 00:15:15.839 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:15.839 "assigned_rate_limits": { 00:15:15.839 "rw_ios_per_sec": 0, 00:15:15.839 "rw_mbytes_per_sec": 0, 00:15:15.839 "r_mbytes_per_sec": 0, 00:15:15.839 "w_mbytes_per_sec": 0 00:15:15.839 }, 00:15:15.839 "claimed": false, 00:15:15.839 "zoned": false, 00:15:15.839 "supported_io_types": { 00:15:15.839 "read": true, 00:15:15.839 "write": true, 00:15:15.839 "unmap": true, 00:15:15.839 "flush": true, 00:15:15.839 "reset": true, 00:15:15.839 "nvme_admin": false, 00:15:15.839 "nvme_io": false, 00:15:15.839 "nvme_io_md": false, 00:15:15.839 "write_zeroes": true, 00:15:15.839 "zcopy": true, 00:15:15.839 "get_zone_info": false, 00:15:15.839 "zone_management": false, 00:15:15.839 "zone_append": false, 00:15:15.839 "compare": false, 00:15:15.839 "compare_and_write": false, 00:15:15.839 "abort": true, 
00:15:15.839 "seek_hole": false, 00:15:15.839 "seek_data": false, 00:15:15.839 "copy": true, 00:15:15.839 "nvme_iov_md": false 00:15:15.839 }, 00:15:15.839 "memory_domains": [ 00:15:15.839 { 00:15:15.839 "dma_device_id": "system", 00:15:15.839 "dma_device_type": 1 00:15:15.839 }, 00:15:15.839 { 00:15:15.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.839 "dma_device_type": 2 00:15:15.839 } 00:15:15.839 ], 00:15:15.839 "driver_specific": {} 00:15:15.839 } 00:15:15.839 ] 00:15:15.839 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:15.839 07:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:15.839 07:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:15.839 07:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:15.839 BaseBdev3 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:16.098 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:16.358 [ 00:15:16.358 { 00:15:16.358 "name": "BaseBdev3", 00:15:16.358 "aliases": [ 00:15:16.358 "552ea81e-e9b8-4258-be6e-1e961285fbc0" 00:15:16.358 ], 00:15:16.358 "product_name": "Malloc disk", 00:15:16.358 "block_size": 512, 00:15:16.358 "num_blocks": 65536, 00:15:16.358 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:16.358 "assigned_rate_limits": { 00:15:16.358 "rw_ios_per_sec": 0, 00:15:16.358 "rw_mbytes_per_sec": 0, 00:15:16.358 "r_mbytes_per_sec": 0, 00:15:16.358 "w_mbytes_per_sec": 0 00:15:16.358 }, 00:15:16.358 "claimed": false, 00:15:16.358 "zoned": false, 00:15:16.358 "supported_io_types": { 00:15:16.358 "read": true, 00:15:16.358 "write": true, 00:15:16.358 "unmap": true, 00:15:16.358 "flush": true, 00:15:16.358 "reset": true, 00:15:16.358 "nvme_admin": false, 00:15:16.358 "nvme_io": false, 00:15:16.358 "nvme_io_md": false, 00:15:16.358 "write_zeroes": true, 00:15:16.358 "zcopy": true, 00:15:16.358 "get_zone_info": false, 00:15:16.358 "zone_management": false, 00:15:16.358 "zone_append": false, 00:15:16.358 "compare": false, 00:15:16.358 "compare_and_write": false, 00:15:16.358 "abort": true, 00:15:16.358 "seek_hole": false, 00:15:16.358 "seek_data": false, 00:15:16.358 "copy": true, 00:15:16.358 "nvme_iov_md": false 00:15:16.358 }, 00:15:16.358 "memory_domains": [ 00:15:16.358 { 00:15:16.358 "dma_device_id": "system", 00:15:16.358 
"dma_device_type": 1 00:15:16.358 }, 00:15:16.358 { 00:15:16.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.358 "dma_device_type": 2 00:15:16.358 } 00:15:16.358 ], 00:15:16.358 "driver_specific": {} 00:15:16.358 } 00:15:16.358 ] 00:15:16.358 07:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:16.358 07:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:16.358 07:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:16.358 07:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:16.617 [2024-07-25 07:20:49.025861] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:16.617 [2024-07-25 07:20:49.025897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:16.617 [2024-07-25 07:20:49.025915] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:16.617 [2024-07-25 07:20:49.027146] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.617 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.876 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.876 "name": "Existed_Raid", 00:15:16.876 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:16.876 "strip_size_kb": 64, 00:15:16.876 "state": "configuring", 00:15:16.876 "raid_level": "raid0", 00:15:16.876 "superblock": true, 00:15:16.876 "num_base_bdevs": 3, 00:15:16.876 "num_base_bdevs_discovered": 2, 00:15:16.876 "num_base_bdevs_operational": 3, 00:15:16.876 "base_bdevs_list": [ 00:15:16.876 { 00:15:16.876 "name": "BaseBdev1", 00:15:16.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.876 "is_configured": false, 00:15:16.876 "data_offset": 0, 
00:15:16.876 "data_size": 0 00:15:16.876 }, 00:15:16.876 { 00:15:16.876 "name": "BaseBdev2", 00:15:16.876 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:16.876 "is_configured": true, 00:15:16.876 "data_offset": 2048, 00:15:16.876 "data_size": 63488 00:15:16.876 }, 00:15:16.876 { 00:15:16.876 "name": "BaseBdev3", 00:15:16.876 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:16.876 "is_configured": true, 00:15:16.876 "data_offset": 2048, 00:15:16.876 "data_size": 63488 00:15:16.876 } 00:15:16.876 ] 00:15:16.876 }' 00:15:16.876 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.876 07:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:17.443 07:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:17.702 [2024-07-25 07:20:50.036484] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.703 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.962 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.962 "name": "Existed_Raid", 00:15:17.962 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:17.962 "strip_size_kb": 64, 00:15:17.962 "state": "configuring", 00:15:17.962 "raid_level": "raid0", 00:15:17.962 "superblock": true, 00:15:17.962 "num_base_bdevs": 3, 00:15:17.962 "num_base_bdevs_discovered": 1, 00:15:17.962 "num_base_bdevs_operational": 3, 00:15:17.962 "base_bdevs_list": [ 00:15:17.962 { 00:15:17.962 "name": "BaseBdev1", 00:15:17.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.962 "is_configured": false, 00:15:17.962 "data_offset": 0, 00:15:17.962 "data_size": 0 00:15:17.962 }, 00:15:17.962 { 00:15:17.962 "name": null, 00:15:17.962 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:17.962 "is_configured": false, 00:15:17.962 "data_offset": 2048, 00:15:17.962 "data_size": 63488 00:15:17.962 }, 00:15:17.962 { 
00:15:17.962 "name": "BaseBdev3", 00:15:17.962 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:17.962 "is_configured": true, 00:15:17.962 "data_offset": 2048, 00:15:17.962 "data_size": 63488 00:15:17.962 } 00:15:17.962 ] 00:15:17.962 }' 00:15:17.962 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.962 07:20:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.528 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.528 07:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:18.528 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:18.528 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:18.787 [2024-07-25 07:20:51.226971] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:18.787 BaseBdev1 00:15:18.787 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:18.787 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:18.787 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:18.787 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:18.787 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:18.787 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:18.787 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:19.046 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:19.306 [ 00:15:19.306 { 00:15:19.306 "name": "BaseBdev1", 00:15:19.306 "aliases": [ 00:15:19.306 "7b39c04d-9988-4771-abac-d849713cb95c" 00:15:19.306 ], 00:15:19.306 "product_name": "Malloc disk", 00:15:19.306 "block_size": 512, 00:15:19.306 "num_blocks": 65536, 00:15:19.306 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:19.306 "assigned_rate_limits": { 00:15:19.306 "rw_ios_per_sec": 0, 00:15:19.306 "rw_mbytes_per_sec": 0, 00:15:19.306 "r_mbytes_per_sec": 0, 00:15:19.306 "w_mbytes_per_sec": 0 00:15:19.306 }, 00:15:19.306 "claimed": true, 00:15:19.306 "claim_type": "exclusive_write", 00:15:19.306 "zoned": false, 00:15:19.306 "supported_io_types": { 00:15:19.306 "read": true, 00:15:19.306 "write": true, 00:15:19.306 "unmap": true, 00:15:19.306 "flush": true, 00:15:19.306 "reset": true, 00:15:19.306 "nvme_admin": false, 00:15:19.306 "nvme_io": false, 00:15:19.306 "nvme_io_md": false, 00:15:19.306 "write_zeroes": true, 00:15:19.306 "zcopy": true, 00:15:19.306 "get_zone_info": false, 00:15:19.306 "zone_management": false, 00:15:19.306 "zone_append": false, 00:15:19.306 "compare": false, 00:15:19.306 "compare_and_write": false, 
00:15:19.306 "abort": true, 00:15:19.306 "seek_hole": false, 00:15:19.306 "seek_data": false, 00:15:19.306 "copy": true, 00:15:19.306 "nvme_iov_md": false 00:15:19.306 }, 00:15:19.306 "memory_domains": [ 00:15:19.306 { 00:15:19.306 "dma_device_id": "system", 00:15:19.306 "dma_device_type": 1 00:15:19.306 }, 00:15:19.306 { 00:15:19.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.306 "dma_device_type": 2 00:15:19.306 } 00:15:19.306 ], 00:15:19.306 "driver_specific": {} 00:15:19.306 } 00:15:19.306 ] 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.306 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.565 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.565 "name": "Existed_Raid", 00:15:19.565 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:19.565 "strip_size_kb": 64, 00:15:19.565 "state": "configuring", 00:15:19.565 "raid_level": "raid0", 00:15:19.565 "superblock": true, 00:15:19.565 "num_base_bdevs": 3, 00:15:19.565 "num_base_bdevs_discovered": 2, 00:15:19.565 "num_base_bdevs_operational": 3, 00:15:19.565 "base_bdevs_list": [ 00:15:19.565 { 00:15:19.565 "name": "BaseBdev1", 00:15:19.565 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:19.565 "is_configured": true, 00:15:19.565 "data_offset": 2048, 00:15:19.565 "data_size": 63488 00:15:19.565 }, 00:15:19.565 { 00:15:19.565 "name": null, 00:15:19.565 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:19.565 "is_configured": false, 00:15:19.565 "data_offset": 2048, 00:15:19.565 "data_size": 63488 00:15:19.565 }, 00:15:19.565 { 00:15:19.565 "name": "BaseBdev3", 00:15:19.565 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:19.565 "is_configured": true, 00:15:19.565 "data_offset": 2048, 00:15:19.565 "data_size": 63488 00:15:19.565 } 00:15:19.565 ] 00:15:19.565 }' 00:15:19.565 07:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.565 07:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:15:20.132 07:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.132 07:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:20.390 07:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:20.390 07:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:20.961 [2024-07-25 07:20:53.196193] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.961 "name": "Existed_Raid", 00:15:20.961 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:20.961 "strip_size_kb": 64, 00:15:20.961 "state": "configuring", 00:15:20.961 "raid_level": "raid0", 00:15:20.961 "superblock": true, 00:15:20.961 "num_base_bdevs": 3, 00:15:20.961 "num_base_bdevs_discovered": 1, 00:15:20.961 "num_base_bdevs_operational": 3, 00:15:20.961 "base_bdevs_list": [ 00:15:20.961 { 00:15:20.961 "name": "BaseBdev1", 00:15:20.961 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:20.961 "is_configured": true, 00:15:20.961 "data_offset": 2048, 00:15:20.961 "data_size": 63488 00:15:20.961 }, 00:15:20.961 { 00:15:20.961 "name": null, 00:15:20.961 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:20.961 "is_configured": false, 00:15:20.961 "data_offset": 2048, 00:15:20.961 "data_size": 63488 00:15:20.961 }, 00:15:20.961 { 00:15:20.961 "name": null, 00:15:20.961 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:20.961 "is_configured": false, 00:15:20.961 "data_offset": 2048, 00:15:20.961 "data_size": 63488 00:15:20.961 } 00:15:20.961 ] 00:15:20.961 }' 00:15:20.961 07:20:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.961 07:20:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:21.567 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.567 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:21.826 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:21.826 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:22.084 [2024-07-25 07:20:54.463684] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:22.084 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.085 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.343 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.343 "name": "Existed_Raid", 00:15:22.343 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:22.343 "strip_size_kb": 64, 00:15:22.343 "state": "configuring", 00:15:22.343 "raid_level": "raid0", 00:15:22.343 "superblock": true, 00:15:22.343 "num_base_bdevs": 3, 00:15:22.343 "num_base_bdevs_discovered": 2, 00:15:22.343 "num_base_bdevs_operational": 3, 00:15:22.343 "base_bdevs_list": [ 00:15:22.343 { 00:15:22.343 "name": "BaseBdev1", 00:15:22.343 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:22.343 "is_configured": true, 00:15:22.343 "data_offset": 2048, 00:15:22.343 "data_size": 63488 00:15:22.343 }, 00:15:22.343 { 00:15:22.343 "name": null, 00:15:22.343 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:22.343 "is_configured": false, 00:15:22.343 "data_offset": 2048, 00:15:22.343 "data_size": 63488 00:15:22.343 }, 00:15:22.343 { 00:15:22.343 "name": "BaseBdev3", 00:15:22.343 "uuid": 
"552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:22.343 "is_configured": true, 00:15:22.343 "data_offset": 2048, 00:15:22.343 "data_size": 63488 00:15:22.343 } 00:15:22.343 ] 00:15:22.343 }' 00:15:22.343 07:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.343 07:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.910 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.910 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:23.168 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:23.169 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:23.169 [2024-07-25 07:20:55.686934] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.427 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.427 "name": "Existed_Raid", 00:15:23.427 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:23.427 "strip_size_kb": 64, 00:15:23.427 "state": "configuring", 00:15:23.427 "raid_level": "raid0", 00:15:23.427 "superblock": true, 00:15:23.427 "num_base_bdevs": 3, 00:15:23.427 "num_base_bdevs_discovered": 1, 00:15:23.427 "num_base_bdevs_operational": 3, 00:15:23.427 "base_bdevs_list": [ 00:15:23.427 { 00:15:23.427 "name": null, 00:15:23.427 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:23.427 "is_configured": false, 00:15:23.427 "data_offset": 2048, 00:15:23.427 "data_size": 63488 00:15:23.427 }, 00:15:23.427 { 00:15:23.427 "name": null, 00:15:23.428 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:23.428 "is_configured": 
false, 00:15:23.428 "data_offset": 2048, 00:15:23.428 "data_size": 63488 00:15:23.428 }, 00:15:23.428 { 00:15:23.428 "name": "BaseBdev3", 00:15:23.428 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:23.428 "is_configured": true, 00:15:23.428 "data_offset": 2048, 00:15:23.428 "data_size": 63488 00:15:23.428 } 00:15:23.428 ] 00:15:23.428 }' 00:15:23.428 07:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.428 07:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.997 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.997 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:24.256 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:24.256 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:24.516 [2024-07-25 07:20:56.916198] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.516 07:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.775 07:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.775 "name": "Existed_Raid", 00:15:24.775 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:24.775 "strip_size_kb": 64, 00:15:24.775 "state": "configuring", 00:15:24.775 "raid_level": "raid0", 00:15:24.775 "superblock": true, 00:15:24.775 "num_base_bdevs": 3, 00:15:24.775 "num_base_bdevs_discovered": 2, 00:15:24.775 "num_base_bdevs_operational": 3, 00:15:24.775 "base_bdevs_list": [ 00:15:24.775 { 00:15:24.775 "name": null, 00:15:24.775 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:24.775 "is_configured": false, 00:15:24.775 
"data_offset": 2048, 00:15:24.775 "data_size": 63488 00:15:24.775 }, 00:15:24.775 { 00:15:24.775 "name": "BaseBdev2", 00:15:24.775 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:24.775 "is_configured": true, 00:15:24.775 "data_offset": 2048, 00:15:24.775 "data_size": 63488 00:15:24.775 }, 00:15:24.775 { 00:15:24.775 "name": "BaseBdev3", 00:15:24.775 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:24.775 "is_configured": true, 00:15:24.775 "data_offset": 2048, 00:15:24.775 "data_size": 63488 00:15:24.775 } 00:15:24.775 ] 00:15:24.775 }' 00:15:24.775 07:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.775 07:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.342 07:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:25.342 07:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.601 07:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:25.601 07:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.601 07:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:25.860 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7b39c04d-9988-4771-abac-d849713cb95c 00:15:26.119 [2024-07-25 07:20:58.411257] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:26.119 [2024-07-25 07:20:58.411398] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1776000 00:15:26.119 [2024-07-25 07:20:58.411411] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:26.119 [2024-07-25 07:20:58.411568] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15c7e90 00:15:26.119 [2024-07-25 07:20:58.411672] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1776000 00:15:26.119 [2024-07-25 07:20:58.411681] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1776000 00:15:26.119 [2024-07-25 07:20:58.411765] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:26.119 NewBaseBdev 00:15:26.119 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:26.119 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:15:26.119 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:26.119 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:26.119 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:26.119 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:26.119 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:26.378 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:26.378 [ 00:15:26.378 { 00:15:26.378 "name": "NewBaseBdev", 00:15:26.378 "aliases": [ 00:15:26.378 "7b39c04d-9988-4771-abac-d849713cb95c" 00:15:26.378 ], 00:15:26.378 "product_name": "Malloc disk", 00:15:26.378 "block_size": 512, 00:15:26.378 "num_blocks": 65536, 00:15:26.378 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:26.378 "assigned_rate_limits": { 00:15:26.378 "rw_ios_per_sec": 0, 00:15:26.378 "rw_mbytes_per_sec": 0, 00:15:26.378 "r_mbytes_per_sec": 0, 00:15:26.378 "w_mbytes_per_sec": 0 00:15:26.378 }, 00:15:26.378 "claimed": true, 00:15:26.378 "claim_type": "exclusive_write", 00:15:26.378 "zoned": false, 00:15:26.378 "supported_io_types": { 00:15:26.378 "read": true, 00:15:26.378 "write": true, 00:15:26.378 "unmap": true, 00:15:26.378 "flush": true, 00:15:26.378 "reset": true, 00:15:26.379 "nvme_admin": false, 00:15:26.379 "nvme_io": false, 00:15:26.379 "nvme_io_md": false, 00:15:26.379 "write_zeroes": true, 00:15:26.379 "zcopy": true, 00:15:26.379 "get_zone_info": false, 00:15:26.379 "zone_management": false, 00:15:26.379 "zone_append": false, 00:15:26.379 "compare": false, 00:15:26.379 "compare_and_write": false, 00:15:26.379 "abort": true, 00:15:26.379 "seek_hole": false, 00:15:26.379 "seek_data": false, 00:15:26.379 "copy": true, 00:15:26.379 "nvme_iov_md": false 00:15:26.379 }, 00:15:26.379 "memory_domains": [ 00:15:26.379 { 00:15:26.379 "dma_device_id": "system", 00:15:26.379 "dma_device_type": 1 00:15:26.379 }, 00:15:26.379 { 00:15:26.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.379 "dma_device_type": 2 00:15:26.379 } 00:15:26.379 ], 00:15:26.379 "driver_specific": {} 00:15:26.379 } 00:15:26.379 ] 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.379 07:20:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:26.638 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.638 "name": "Existed_Raid", 00:15:26.638 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:26.638 "strip_size_kb": 64, 00:15:26.638 "state": "online", 00:15:26.638 "raid_level": "raid0", 00:15:26.638 "superblock": true, 00:15:26.638 "num_base_bdevs": 3, 00:15:26.638 "num_base_bdevs_discovered": 3, 00:15:26.638 "num_base_bdevs_operational": 3, 00:15:26.638 "base_bdevs_list": [ 00:15:26.638 { 00:15:26.638 "name": "NewBaseBdev", 00:15:26.638 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:26.638 "is_configured": true, 00:15:26.638 "data_offset": 2048, 00:15:26.638 "data_size": 63488 00:15:26.638 }, 00:15:26.638 { 00:15:26.638 "name": "BaseBdev2", 00:15:26.638 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:26.638 "is_configured": true, 00:15:26.638 "data_offset": 2048, 00:15:26.638 "data_size": 63488 00:15:26.638 }, 00:15:26.638 { 00:15:26.638 "name": "BaseBdev3", 00:15:26.638 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:26.638 "is_configured": true, 00:15:26.638 "data_offset": 2048, 00:15:26.638 "data_size": 63488 00:15:26.638 } 00:15:26.638 ] 00:15:26.638 }' 00:15:26.638 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.638 07:20:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:27.206 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:27.465 [2024-07-25 07:20:59.875634] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:27.465 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:27.465 "name": "Existed_Raid", 00:15:27.465 "aliases": [ 00:15:27.465 "fcb50376-647f-47de-b26b-bc54e7f1a9aa" 00:15:27.465 ], 00:15:27.465 "product_name": "Raid Volume", 00:15:27.465 "block_size": 512, 00:15:27.465 "num_blocks": 190464, 00:15:27.465 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:27.465 "assigned_rate_limits": { 00:15:27.465 "rw_ios_per_sec": 0, 00:15:27.465 "rw_mbytes_per_sec": 0, 00:15:27.465 "r_mbytes_per_sec": 0, 00:15:27.465 "w_mbytes_per_sec": 0 00:15:27.465 }, 00:15:27.465 "claimed": false, 00:15:27.465 "zoned": false, 00:15:27.465 "supported_io_types": { 00:15:27.465 "read": true, 00:15:27.465 "write": true, 00:15:27.465 "unmap": true, 00:15:27.465 "flush": true, 00:15:27.465 "reset": true, 00:15:27.465 "nvme_admin": false, 00:15:27.465 "nvme_io": false, 00:15:27.465 "nvme_io_md": 
false, 00:15:27.465 "write_zeroes": true, 00:15:27.465 "zcopy": false, 00:15:27.465 "get_zone_info": false, 00:15:27.465 "zone_management": false, 00:15:27.465 "zone_append": false, 00:15:27.465 "compare": false, 00:15:27.465 "compare_and_write": false, 00:15:27.465 "abort": false, 00:15:27.465 "seek_hole": false, 00:15:27.465 "seek_data": false, 00:15:27.465 "copy": false, 00:15:27.465 "nvme_iov_md": false 00:15:27.466 }, 00:15:27.466 "memory_domains": [ 00:15:27.466 { 00:15:27.466 "dma_device_id": "system", 00:15:27.466 "dma_device_type": 1 00:15:27.466 }, 00:15:27.466 { 00:15:27.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.466 "dma_device_type": 2 00:15:27.466 }, 00:15:27.466 { 00:15:27.466 "dma_device_id": "system", 00:15:27.466 "dma_device_type": 1 00:15:27.466 }, 00:15:27.466 { 00:15:27.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.466 "dma_device_type": 2 00:15:27.466 }, 00:15:27.466 { 00:15:27.466 "dma_device_id": "system", 00:15:27.466 "dma_device_type": 1 00:15:27.466 }, 00:15:27.466 { 00:15:27.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.466 "dma_device_type": 2 00:15:27.466 } 00:15:27.466 ], 00:15:27.466 "driver_specific": { 00:15:27.466 "raid": { 00:15:27.466 "uuid": "fcb50376-647f-47de-b26b-bc54e7f1a9aa", 00:15:27.466 "strip_size_kb": 64, 00:15:27.466 "state": "online", 00:15:27.466 "raid_level": "raid0", 00:15:27.466 "superblock": true, 00:15:27.466 "num_base_bdevs": 3, 00:15:27.466 "num_base_bdevs_discovered": 3, 00:15:27.466 "num_base_bdevs_operational": 3, 00:15:27.466 "base_bdevs_list": [ 00:15:27.466 { 00:15:27.466 "name": "NewBaseBdev", 00:15:27.466 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:27.466 "is_configured": true, 00:15:27.466 "data_offset": 2048, 00:15:27.466 "data_size": 63488 00:15:27.466 }, 00:15:27.466 { 00:15:27.466 "name": "BaseBdev2", 00:15:27.466 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:27.466 "is_configured": true, 00:15:27.466 "data_offset": 2048, 00:15:27.466 "data_size": 63488 00:15:27.466 }, 00:15:27.466 { 00:15:27.466 "name": "BaseBdev3", 00:15:27.466 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:27.466 "is_configured": true, 00:15:27.466 "data_offset": 2048, 00:15:27.466 "data_size": 63488 00:15:27.466 } 00:15:27.466 ] 00:15:27.466 } 00:15:27.466 } 00:15:27.466 }' 00:15:27.466 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:27.466 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:27.466 BaseBdev2 00:15:27.466 BaseBdev3' 00:15:27.466 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.466 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:27.466 07:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:27.725 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:27.725 "name": "NewBaseBdev", 00:15:27.725 "aliases": [ 00:15:27.725 "7b39c04d-9988-4771-abac-d849713cb95c" 00:15:27.725 ], 00:15:27.725 "product_name": "Malloc disk", 00:15:27.725 "block_size": 512, 00:15:27.725 "num_blocks": 65536, 00:15:27.725 "uuid": "7b39c04d-9988-4771-abac-d849713cb95c", 00:15:27.725 "assigned_rate_limits": { 00:15:27.725 
"rw_ios_per_sec": 0, 00:15:27.725 "rw_mbytes_per_sec": 0, 00:15:27.725 "r_mbytes_per_sec": 0, 00:15:27.725 "w_mbytes_per_sec": 0 00:15:27.725 }, 00:15:27.725 "claimed": true, 00:15:27.725 "claim_type": "exclusive_write", 00:15:27.725 "zoned": false, 00:15:27.725 "supported_io_types": { 00:15:27.725 "read": true, 00:15:27.725 "write": true, 00:15:27.725 "unmap": true, 00:15:27.725 "flush": true, 00:15:27.725 "reset": true, 00:15:27.725 "nvme_admin": false, 00:15:27.725 "nvme_io": false, 00:15:27.725 "nvme_io_md": false, 00:15:27.725 "write_zeroes": true, 00:15:27.725 "zcopy": true, 00:15:27.725 "get_zone_info": false, 00:15:27.725 "zone_management": false, 00:15:27.725 "zone_append": false, 00:15:27.725 "compare": false, 00:15:27.725 "compare_and_write": false, 00:15:27.725 "abort": true, 00:15:27.725 "seek_hole": false, 00:15:27.725 "seek_data": false, 00:15:27.725 "copy": true, 00:15:27.725 "nvme_iov_md": false 00:15:27.725 }, 00:15:27.725 "memory_domains": [ 00:15:27.725 { 00:15:27.725 "dma_device_id": "system", 00:15:27.725 "dma_device_type": 1 00:15:27.725 }, 00:15:27.725 { 00:15:27.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.725 "dma_device_type": 2 00:15:27.725 } 00:15:27.725 ], 00:15:27.725 "driver_specific": {} 00:15:27.725 }' 00:15:27.725 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.725 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.725 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.725 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:27.984 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:28.244 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:28.244 "name": "BaseBdev2", 00:15:28.244 "aliases": [ 00:15:28.244 "27e59f96-1788-4612-83d6-20e8869db0aa" 00:15:28.244 ], 00:15:28.244 "product_name": "Malloc disk", 00:15:28.244 "block_size": 512, 00:15:28.244 "num_blocks": 65536, 00:15:28.244 "uuid": "27e59f96-1788-4612-83d6-20e8869db0aa", 00:15:28.244 "assigned_rate_limits": { 00:15:28.244 "rw_ios_per_sec": 0, 00:15:28.244 "rw_mbytes_per_sec": 0, 00:15:28.244 "r_mbytes_per_sec": 0, 00:15:28.244 "w_mbytes_per_sec": 0 
00:15:28.244 }, 00:15:28.244 "claimed": true, 00:15:28.244 "claim_type": "exclusive_write", 00:15:28.244 "zoned": false, 00:15:28.244 "supported_io_types": { 00:15:28.244 "read": true, 00:15:28.244 "write": true, 00:15:28.244 "unmap": true, 00:15:28.244 "flush": true, 00:15:28.244 "reset": true, 00:15:28.244 "nvme_admin": false, 00:15:28.244 "nvme_io": false, 00:15:28.244 "nvme_io_md": false, 00:15:28.244 "write_zeroes": true, 00:15:28.244 "zcopy": true, 00:15:28.244 "get_zone_info": false, 00:15:28.244 "zone_management": false, 00:15:28.244 "zone_append": false, 00:15:28.244 "compare": false, 00:15:28.244 "compare_and_write": false, 00:15:28.244 "abort": true, 00:15:28.244 "seek_hole": false, 00:15:28.244 "seek_data": false, 00:15:28.244 "copy": true, 00:15:28.244 "nvme_iov_md": false 00:15:28.244 }, 00:15:28.244 "memory_domains": [ 00:15:28.244 { 00:15:28.244 "dma_device_id": "system", 00:15:28.244 "dma_device_type": 1 00:15:28.244 }, 00:15:28.244 { 00:15:28.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.244 "dma_device_type": 2 00:15:28.244 } 00:15:28.244 ], 00:15:28.244 "driver_specific": {} 00:15:28.244 }' 00:15:28.244 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.244 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:28.503 07:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.762 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.762 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:28.762 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:28.762 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:28.762 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:29.021 "name": "BaseBdev3", 00:15:29.021 "aliases": [ 00:15:29.021 "552ea81e-e9b8-4258-be6e-1e961285fbc0" 00:15:29.021 ], 00:15:29.021 "product_name": "Malloc disk", 00:15:29.021 "block_size": 512, 00:15:29.021 "num_blocks": 65536, 00:15:29.021 "uuid": "552ea81e-e9b8-4258-be6e-1e961285fbc0", 00:15:29.021 "assigned_rate_limits": { 00:15:29.021 "rw_ios_per_sec": 0, 00:15:29.021 "rw_mbytes_per_sec": 0, 00:15:29.021 "r_mbytes_per_sec": 0, 00:15:29.021 "w_mbytes_per_sec": 0 00:15:29.021 }, 00:15:29.021 "claimed": true, 00:15:29.021 "claim_type": "exclusive_write", 00:15:29.021 "zoned": false, 00:15:29.021 
"supported_io_types": { 00:15:29.021 "read": true, 00:15:29.021 "write": true, 00:15:29.021 "unmap": true, 00:15:29.021 "flush": true, 00:15:29.021 "reset": true, 00:15:29.021 "nvme_admin": false, 00:15:29.021 "nvme_io": false, 00:15:29.021 "nvme_io_md": false, 00:15:29.021 "write_zeroes": true, 00:15:29.021 "zcopy": true, 00:15:29.021 "get_zone_info": false, 00:15:29.021 "zone_management": false, 00:15:29.021 "zone_append": false, 00:15:29.021 "compare": false, 00:15:29.021 "compare_and_write": false, 00:15:29.021 "abort": true, 00:15:29.021 "seek_hole": false, 00:15:29.021 "seek_data": false, 00:15:29.021 "copy": true, 00:15:29.021 "nvme_iov_md": false 00:15:29.021 }, 00:15:29.021 "memory_domains": [ 00:15:29.021 { 00:15:29.021 "dma_device_id": "system", 00:15:29.021 "dma_device_type": 1 00:15:29.021 }, 00:15:29.021 { 00:15:29.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.021 "dma_device_type": 2 00:15:29.021 } 00:15:29.021 ], 00:15:29.021 "driver_specific": {} 00:15:29.021 }' 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:29.021 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:29.280 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:29.280 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:29.280 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:29.280 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:29.538 [2024-07-25 07:21:01.832575] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:29.538 [2024-07-25 07:21:01.832597] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:29.538 [2024-07-25 07:21:01.832644] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:29.538 [2024-07-25 07:21:01.832691] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:29.538 [2024-07-25 07:21:01.832702] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1776000 name Existed_Raid, state offline 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1617240 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1617240 ']' 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1617240 00:15:29.538 07:21:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1617240 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1617240' 00:15:29.538 killing process with pid 1617240 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1617240 00:15:29.538 [2024-07-25 07:21:01.910761] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:29.538 07:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1617240 00:15:29.538 [2024-07-25 07:21:01.933511] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:29.797 07:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:29.797 00:15:29.797 real 0m26.289s 00:15:29.797 user 0m48.159s 00:15:29.797 sys 0m4.825s 00:15:29.797 07:21:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:29.797 07:21:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:29.797 ************************************ 00:15:29.797 END TEST raid_state_function_test_sb 00:15:29.797 ************************************ 00:15:29.797 07:21:02 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:15:29.797 07:21:02 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:15:29.797 07:21:02 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:29.797 07:21:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:29.797 ************************************ 00:15:29.797 START TEST raid_superblock_test 00:15:29.797 ************************************ 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 3 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local 
strip_size_create_arg 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:15:29.797 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1622360 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1622360 /var/tmp/spdk-raid.sock 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1622360 ']' 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:29.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:29.798 07:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.798 [2024-07-25 07:21:02.266333] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:15:29.798 [2024-07-25 07:21:02.266389] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1622360 ] 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:30.057 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:30.057 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:30.057 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:30.057 [2024-07-25 07:21:02.398377] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:30.057 [2024-07-25 07:21:02.482903] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:15:30.057 [2024-07-25 07:21:02.552448] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:30.057 [2024-07-25 07:21:02.552484] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:30.622 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:30.880 malloc1 00:15:30.880 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:31.139 [2024-07-25 07:21:03.562897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:31.139 [2024-07-25 07:21:03.562939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:31.139 [2024-07-25 07:21:03.562960] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd6c280 00:15:31.139 [2024-07-25 07:21:03.562971] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:31.139 [2024-07-25 07:21:03.564557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:31.139 [2024-07-25 07:21:03.564584] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:31.139 pt1 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:31.139 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:31.398 malloc2 00:15:31.398 07:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:31.657 [2024-07-25 07:21:04.016561] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:31.657 [2024-07-25 07:21:04.016601] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:31.657 [2024-07-25 07:21:04.016616] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf178c0 00:15:31.657 [2024-07-25 07:21:04.016627] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:31.657 [2024-07-25 07:21:04.017958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:31.657 [2024-07-25 07:21:04.017984] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:31.657 pt2 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:31.657 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:31.915 malloc3 00:15:31.915 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:32.174 [2024-07-25 07:21:04.458229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:32.174 [2024-07-25 07:21:04.458268] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:32.174 [2024-07-25 07:21:04.458284] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf17ef0 00:15:32.174 [2024-07-25 07:21:04.458295] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:32.174 [2024-07-25 07:21:04.459634] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:32.174 [2024-07-25 07:21:04.459660] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:32.174 pt3 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:32.174 [2024-07-25 07:21:04.674819] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:32.174 [2024-07-25 07:21:04.675926] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:32.174 [2024-07-25 07:21:04.675977] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:32.174 [2024-07-25 07:21:04.676117] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf1b330 00:15:32.174 [2024-07-25 07:21:04.676127] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:32.174 [2024-07-25 07:21:04.676312] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd83050 00:15:32.174 [2024-07-25 07:21:04.676439] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf1b330 00:15:32.174 [2024-07-25 07:21:04.676449] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf1b330 00:15:32.174 [2024-07-25 07:21:04.676536] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.174 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:32.433 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.433 "name": "raid_bdev1", 00:15:32.433 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:32.433 "strip_size_kb": 64, 00:15:32.433 "state": "online", 00:15:32.433 "raid_level": "raid0", 00:15:32.433 "superblock": true, 00:15:32.433 "num_base_bdevs": 3, 00:15:32.433 "num_base_bdevs_discovered": 3, 00:15:32.433 "num_base_bdevs_operational": 3, 00:15:32.433 "base_bdevs_list": [ 00:15:32.433 { 00:15:32.433 "name": "pt1", 00:15:32.433 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:32.433 "is_configured": true, 00:15:32.433 "data_offset": 2048, 00:15:32.433 "data_size": 63488 00:15:32.433 }, 00:15:32.433 { 00:15:32.433 "name": "pt2", 00:15:32.433 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:32.433 "is_configured": true, 00:15:32.433 "data_offset": 2048, 00:15:32.433 "data_size": 63488 00:15:32.433 }, 00:15:32.433 { 00:15:32.433 "name": "pt3", 00:15:32.433 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:32.433 "is_configured": true, 00:15:32.433 "data_offset": 2048, 00:15:32.433 "data_size": 63488 00:15:32.433 } 00:15:32.433 ] 00:15:32.433 }' 00:15:32.433 07:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.433 07:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:33.000 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:33.259 [2024-07-25 07:21:05.705853] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.259 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:33.259 "name": "raid_bdev1", 00:15:33.259 "aliases": [ 00:15:33.259 "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244" 00:15:33.259 ], 00:15:33.259 "product_name": "Raid Volume", 00:15:33.259 "block_size": 512, 00:15:33.259 "num_blocks": 190464, 00:15:33.259 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:33.259 "assigned_rate_limits": { 00:15:33.259 "rw_ios_per_sec": 0, 00:15:33.259 "rw_mbytes_per_sec": 0, 00:15:33.259 
"r_mbytes_per_sec": 0, 00:15:33.259 "w_mbytes_per_sec": 0 00:15:33.259 }, 00:15:33.259 "claimed": false, 00:15:33.259 "zoned": false, 00:15:33.259 "supported_io_types": { 00:15:33.259 "read": true, 00:15:33.259 "write": true, 00:15:33.259 "unmap": true, 00:15:33.259 "flush": true, 00:15:33.259 "reset": true, 00:15:33.259 "nvme_admin": false, 00:15:33.259 "nvme_io": false, 00:15:33.259 "nvme_io_md": false, 00:15:33.259 "write_zeroes": true, 00:15:33.259 "zcopy": false, 00:15:33.259 "get_zone_info": false, 00:15:33.259 "zone_management": false, 00:15:33.259 "zone_append": false, 00:15:33.259 "compare": false, 00:15:33.259 "compare_and_write": false, 00:15:33.259 "abort": false, 00:15:33.259 "seek_hole": false, 00:15:33.259 "seek_data": false, 00:15:33.259 "copy": false, 00:15:33.259 "nvme_iov_md": false 00:15:33.259 }, 00:15:33.259 "memory_domains": [ 00:15:33.259 { 00:15:33.259 "dma_device_id": "system", 00:15:33.259 "dma_device_type": 1 00:15:33.259 }, 00:15:33.259 { 00:15:33.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.259 "dma_device_type": 2 00:15:33.259 }, 00:15:33.259 { 00:15:33.259 "dma_device_id": "system", 00:15:33.259 "dma_device_type": 1 00:15:33.259 }, 00:15:33.259 { 00:15:33.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.259 "dma_device_type": 2 00:15:33.259 }, 00:15:33.259 { 00:15:33.259 "dma_device_id": "system", 00:15:33.259 "dma_device_type": 1 00:15:33.259 }, 00:15:33.259 { 00:15:33.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.259 "dma_device_type": 2 00:15:33.259 } 00:15:33.259 ], 00:15:33.260 "driver_specific": { 00:15:33.260 "raid": { 00:15:33.260 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:33.260 "strip_size_kb": 64, 00:15:33.260 "state": "online", 00:15:33.260 "raid_level": "raid0", 00:15:33.260 "superblock": true, 00:15:33.260 "num_base_bdevs": 3, 00:15:33.260 "num_base_bdevs_discovered": 3, 00:15:33.260 "num_base_bdevs_operational": 3, 00:15:33.260 "base_bdevs_list": [ 00:15:33.260 { 00:15:33.260 "name": "pt1", 00:15:33.260 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:33.260 "is_configured": true, 00:15:33.260 "data_offset": 2048, 00:15:33.260 "data_size": 63488 00:15:33.260 }, 00:15:33.260 { 00:15:33.260 "name": "pt2", 00:15:33.260 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:33.260 "is_configured": true, 00:15:33.260 "data_offset": 2048, 00:15:33.260 "data_size": 63488 00:15:33.260 }, 00:15:33.260 { 00:15:33.260 "name": "pt3", 00:15:33.260 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:33.260 "is_configured": true, 00:15:33.260 "data_offset": 2048, 00:15:33.260 "data_size": 63488 00:15:33.260 } 00:15:33.260 ] 00:15:33.260 } 00:15:33.260 } 00:15:33.260 }' 00:15:33.260 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:33.260 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:33.260 pt2 00:15:33.260 pt3' 00:15:33.260 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.260 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:33.260 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.518 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.518 "name": "pt1", 00:15:33.518 "aliases": [ 
00:15:33.518 "00000000-0000-0000-0000-000000000001" 00:15:33.518 ], 00:15:33.518 "product_name": "passthru", 00:15:33.518 "block_size": 512, 00:15:33.518 "num_blocks": 65536, 00:15:33.518 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:33.518 "assigned_rate_limits": { 00:15:33.518 "rw_ios_per_sec": 0, 00:15:33.518 "rw_mbytes_per_sec": 0, 00:15:33.518 "r_mbytes_per_sec": 0, 00:15:33.518 "w_mbytes_per_sec": 0 00:15:33.518 }, 00:15:33.518 "claimed": true, 00:15:33.518 "claim_type": "exclusive_write", 00:15:33.518 "zoned": false, 00:15:33.518 "supported_io_types": { 00:15:33.518 "read": true, 00:15:33.518 "write": true, 00:15:33.518 "unmap": true, 00:15:33.518 "flush": true, 00:15:33.518 "reset": true, 00:15:33.518 "nvme_admin": false, 00:15:33.518 "nvme_io": false, 00:15:33.518 "nvme_io_md": false, 00:15:33.518 "write_zeroes": true, 00:15:33.518 "zcopy": true, 00:15:33.518 "get_zone_info": false, 00:15:33.518 "zone_management": false, 00:15:33.518 "zone_append": false, 00:15:33.518 "compare": false, 00:15:33.518 "compare_and_write": false, 00:15:33.518 "abort": true, 00:15:33.518 "seek_hole": false, 00:15:33.518 "seek_data": false, 00:15:33.518 "copy": true, 00:15:33.518 "nvme_iov_md": false 00:15:33.518 }, 00:15:33.518 "memory_domains": [ 00:15:33.518 { 00:15:33.518 "dma_device_id": "system", 00:15:33.518 "dma_device_type": 1 00:15:33.518 }, 00:15:33.518 { 00:15:33.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.518 "dma_device_type": 2 00:15:33.518 } 00:15:33.518 ], 00:15:33.518 "driver_specific": { 00:15:33.518 "passthru": { 00:15:33.518 "name": "pt1", 00:15:33.518 "base_bdev_name": "malloc1" 00:15:33.518 } 00:15:33.518 } 00:15:33.519 }' 00:15:33.519 07:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.519 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.779 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.073 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.073 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:34.073 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:34.073 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.073 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.073 "name": "pt2", 00:15:34.073 "aliases": [ 00:15:34.073 "00000000-0000-0000-0000-000000000002" 00:15:34.073 ], 00:15:34.073 "product_name": "passthru", 00:15:34.073 "block_size": 
512, 00:15:34.073 "num_blocks": 65536, 00:15:34.073 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:34.073 "assigned_rate_limits": { 00:15:34.073 "rw_ios_per_sec": 0, 00:15:34.073 "rw_mbytes_per_sec": 0, 00:15:34.073 "r_mbytes_per_sec": 0, 00:15:34.073 "w_mbytes_per_sec": 0 00:15:34.073 }, 00:15:34.073 "claimed": true, 00:15:34.073 "claim_type": "exclusive_write", 00:15:34.073 "zoned": false, 00:15:34.073 "supported_io_types": { 00:15:34.073 "read": true, 00:15:34.073 "write": true, 00:15:34.073 "unmap": true, 00:15:34.073 "flush": true, 00:15:34.073 "reset": true, 00:15:34.073 "nvme_admin": false, 00:15:34.073 "nvme_io": false, 00:15:34.073 "nvme_io_md": false, 00:15:34.073 "write_zeroes": true, 00:15:34.073 "zcopy": true, 00:15:34.073 "get_zone_info": false, 00:15:34.073 "zone_management": false, 00:15:34.073 "zone_append": false, 00:15:34.073 "compare": false, 00:15:34.073 "compare_and_write": false, 00:15:34.073 "abort": true, 00:15:34.073 "seek_hole": false, 00:15:34.073 "seek_data": false, 00:15:34.073 "copy": true, 00:15:34.073 "nvme_iov_md": false 00:15:34.073 }, 00:15:34.073 "memory_domains": [ 00:15:34.073 { 00:15:34.073 "dma_device_id": "system", 00:15:34.073 "dma_device_type": 1 00:15:34.073 }, 00:15:34.073 { 00:15:34.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.073 "dma_device_type": 2 00:15:34.073 } 00:15:34.073 ], 00:15:34.073 "driver_specific": { 00:15:34.073 "passthru": { 00:15:34.073 "name": "pt2", 00:15:34.073 "base_bdev_name": "malloc2" 00:15:34.073 } 00:15:34.073 } 00:15:34.073 }' 00:15:34.073 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.332 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.590 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.590 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.590 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:34.590 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:34.590 07:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.848 "name": "pt3", 00:15:34.848 "aliases": [ 00:15:34.848 "00000000-0000-0000-0000-000000000003" 00:15:34.848 ], 00:15:34.848 "product_name": "passthru", 00:15:34.848 "block_size": 512, 00:15:34.848 "num_blocks": 65536, 00:15:34.848 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:34.848 
"assigned_rate_limits": { 00:15:34.848 "rw_ios_per_sec": 0, 00:15:34.848 "rw_mbytes_per_sec": 0, 00:15:34.848 "r_mbytes_per_sec": 0, 00:15:34.848 "w_mbytes_per_sec": 0 00:15:34.848 }, 00:15:34.848 "claimed": true, 00:15:34.848 "claim_type": "exclusive_write", 00:15:34.848 "zoned": false, 00:15:34.848 "supported_io_types": { 00:15:34.848 "read": true, 00:15:34.848 "write": true, 00:15:34.848 "unmap": true, 00:15:34.848 "flush": true, 00:15:34.848 "reset": true, 00:15:34.848 "nvme_admin": false, 00:15:34.848 "nvme_io": false, 00:15:34.848 "nvme_io_md": false, 00:15:34.848 "write_zeroes": true, 00:15:34.848 "zcopy": true, 00:15:34.848 "get_zone_info": false, 00:15:34.848 "zone_management": false, 00:15:34.848 "zone_append": false, 00:15:34.848 "compare": false, 00:15:34.848 "compare_and_write": false, 00:15:34.848 "abort": true, 00:15:34.848 "seek_hole": false, 00:15:34.848 "seek_data": false, 00:15:34.848 "copy": true, 00:15:34.848 "nvme_iov_md": false 00:15:34.848 }, 00:15:34.848 "memory_domains": [ 00:15:34.848 { 00:15:34.848 "dma_device_id": "system", 00:15:34.848 "dma_device_type": 1 00:15:34.848 }, 00:15:34.848 { 00:15:34.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.848 "dma_device_type": 2 00:15:34.848 } 00:15:34.848 ], 00:15:34.848 "driver_specific": { 00:15:34.848 "passthru": { 00:15:34.848 "name": "pt3", 00:15:34.848 "base_bdev_name": "malloc3" 00:15:34.848 } 00:15:34.848 } 00:15:34.848 }' 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.848 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.106 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:35.106 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.106 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.106 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:35.106 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:35.106 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:15:35.365 [2024-07-25 07:21:07.711136] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:35.365 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=cb73cdcd-d3e7-4511-8fd9-c26bd8cde244 00:15:35.365 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z cb73cdcd-d3e7-4511-8fd9-c26bd8cde244 ']' 00:15:35.365 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:35.623 [2024-07-25 07:21:07.939493] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:35.623 [2024-07-25 07:21:07.939515] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:35.623 [2024-07-25 07:21:07.939562] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:35.623 [2024-07-25 07:21:07.939608] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:35.623 [2024-07-25 07:21:07.939618] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf1b330 name raid_bdev1, state offline 00:15:35.623 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.623 07:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:15:35.882 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:15:35.882 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:15:35.882 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:15:35.882 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:35.882 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:15:35.882 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:36.141 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:15:36.141 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:36.399 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:36.399 07:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:36.658 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:15:36.658 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:36.659 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:36.918 [2024-07-25 07:21:09.315070] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:36.918 [2024-07-25 07:21:09.316338] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:36.918 [2024-07-25 07:21:09.316379] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:36.918 [2024-07-25 07:21:09.316420] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:36.918 [2024-07-25 07:21:09.316453] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:36.918 [2024-07-25 07:21:09.316475] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:36.918 [2024-07-25 07:21:09.316491] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:36.918 [2024-07-25 07:21:09.316500] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf18e30 name raid_bdev1, state configuring 00:15:36.918 request: 00:15:36.918 { 00:15:36.918 "name": "raid_bdev1", 00:15:36.918 "raid_level": "raid0", 00:15:36.918 "base_bdevs": [ 00:15:36.918 "malloc1", 00:15:36.918 "malloc2", 00:15:36.918 "malloc3" 00:15:36.918 ], 00:15:36.918 "strip_size_kb": 64, 00:15:36.918 "superblock": false, 00:15:36.918 "method": "bdev_raid_create", 00:15:36.918 "req_id": 1 00:15:36.918 } 00:15:36.918 Got JSON-RPC error response 00:15:36.918 response: 00:15:36.918 { 00:15:36.918 "code": -17, 00:15:36.918 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:36.918 } 00:15:36.918 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:15:36.918 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:36.918 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:36.918 07:21:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:36.918 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.918 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:15:37.177 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:15:37.177 07:21:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:15:37.177 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:37.436 [2024-07-25 07:21:09.760187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:37.436 [2024-07-25 07:21:09.760228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.436 [2024-07-25 07:21:09.760246] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf15490 00:15:37.436 [2024-07-25 07:21:09.760258] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.436 [2024-07-25 07:21:09.761711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.436 [2024-07-25 07:21:09.761738] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:37.436 [2024-07-25 07:21:09.761795] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:37.436 [2024-07-25 07:21:09.761824] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:37.436 pt1 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.436 07:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:37.695 07:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.695 "name": "raid_bdev1", 00:15:37.695 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:37.695 "strip_size_kb": 64, 00:15:37.695 "state": "configuring", 00:15:37.695 "raid_level": "raid0", 00:15:37.695 "superblock": true, 00:15:37.695 "num_base_bdevs": 3, 00:15:37.695 "num_base_bdevs_discovered": 1, 00:15:37.695 "num_base_bdevs_operational": 3, 00:15:37.695 "base_bdevs_list": [ 00:15:37.695 { 00:15:37.695 "name": "pt1", 00:15:37.695 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.695 "is_configured": true, 00:15:37.695 "data_offset": 2048, 00:15:37.695 "data_size": 63488 00:15:37.695 }, 00:15:37.695 { 00:15:37.695 "name": null, 00:15:37.695 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.695 
"is_configured": false, 00:15:37.695 "data_offset": 2048, 00:15:37.695 "data_size": 63488 00:15:37.695 }, 00:15:37.695 { 00:15:37.695 "name": null, 00:15:37.695 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:37.695 "is_configured": false, 00:15:37.695 "data_offset": 2048, 00:15:37.695 "data_size": 63488 00:15:37.695 } 00:15:37.695 ] 00:15:37.695 }' 00:15:37.695 07:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.695 07:21:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.263 07:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:15:38.263 07:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:38.522 [2024-07-25 07:21:10.806939] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:38.522 [2024-07-25 07:21:10.806989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:38.522 [2024-07-25 07:21:10.807011] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd6ce70 00:15:38.522 [2024-07-25 07:21:10.807022] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:38.522 [2024-07-25 07:21:10.807331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:38.522 [2024-07-25 07:21:10.807348] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:38.522 [2024-07-25 07:21:10.807402] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:38.522 [2024-07-25 07:21:10.807419] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:38.522 pt2 00:15:38.522 07:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:38.522 [2024-07-25 07:21:11.035558] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.781 "name": "raid_bdev1", 00:15:38.781 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:38.781 "strip_size_kb": 64, 00:15:38.781 "state": "configuring", 00:15:38.781 "raid_level": "raid0", 00:15:38.781 "superblock": true, 00:15:38.781 "num_base_bdevs": 3, 00:15:38.781 "num_base_bdevs_discovered": 1, 00:15:38.781 "num_base_bdevs_operational": 3, 00:15:38.781 "base_bdevs_list": [ 00:15:38.781 { 00:15:38.781 "name": "pt1", 00:15:38.781 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.781 "is_configured": true, 00:15:38.781 "data_offset": 2048, 00:15:38.781 "data_size": 63488 00:15:38.781 }, 00:15:38.781 { 00:15:38.781 "name": null, 00:15:38.781 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.781 "is_configured": false, 00:15:38.781 "data_offset": 2048, 00:15:38.781 "data_size": 63488 00:15:38.781 }, 00:15:38.781 { 00:15:38.781 "name": null, 00:15:38.781 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.781 "is_configured": false, 00:15:38.781 "data_offset": 2048, 00:15:38.781 "data_size": 63488 00:15:38.781 } 00:15:38.781 ] 00:15:38.781 }' 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.781 07:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.349 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:15:39.349 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:39.607 07:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:39.607 [2024-07-25 07:21:12.090338] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:39.607 [2024-07-25 07:21:12.090387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:39.607 [2024-07-25 07:21:12.090404] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1b930 00:15:39.607 [2024-07-25 07:21:12.090416] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:39.607 [2024-07-25 07:21:12.090724] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:39.607 [2024-07-25 07:21:12.090740] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:39.608 [2024-07-25 07:21:12.090797] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:39.608 [2024-07-25 07:21:12.090815] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:39.608 pt2 00:15:39.608 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:15:39.608 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:39.608 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:39.866 [2024-07-25 07:21:12.318938] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:39.866 [2024-07-25 07:21:12.318978] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:15:39.866 [2024-07-25 07:21:12.318998] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1bc30 00:15:39.866 [2024-07-25 07:21:12.319009] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:39.866 [2024-07-25 07:21:12.319312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:39.866 [2024-07-25 07:21:12.319328] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:39.866 [2024-07-25 07:21:12.319383] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:39.866 [2024-07-25 07:21:12.319400] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:39.866 [2024-07-25 07:21:12.319499] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf1bef0 00:15:39.866 [2024-07-25 07:21:12.319508] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:39.866 [2024-07-25 07:21:12.319664] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd6de40 00:15:39.866 [2024-07-25 07:21:12.319776] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf1bef0 00:15:39.866 [2024-07-25 07:21:12.319785] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf1bef0 00:15:39.866 [2024-07-25 07:21:12.319871] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:39.866 pt3 00:15:39.866 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:15:39.866 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:39.866 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:39.866 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:39.866 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.867 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:40.126 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.126 "name": "raid_bdev1", 00:15:40.126 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:40.126 "strip_size_kb": 64, 00:15:40.126 "state": "online", 00:15:40.126 "raid_level": "raid0", 00:15:40.126 "superblock": true, 00:15:40.126 "num_base_bdevs": 3, 00:15:40.126 
"num_base_bdevs_discovered": 3, 00:15:40.126 "num_base_bdevs_operational": 3, 00:15:40.126 "base_bdevs_list": [ 00:15:40.126 { 00:15:40.126 "name": "pt1", 00:15:40.126 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:40.126 "is_configured": true, 00:15:40.126 "data_offset": 2048, 00:15:40.126 "data_size": 63488 00:15:40.126 }, 00:15:40.126 { 00:15:40.126 "name": "pt2", 00:15:40.126 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:40.126 "is_configured": true, 00:15:40.126 "data_offset": 2048, 00:15:40.126 "data_size": 63488 00:15:40.126 }, 00:15:40.126 { 00:15:40.126 "name": "pt3", 00:15:40.126 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:40.126 "is_configured": true, 00:15:40.126 "data_offset": 2048, 00:15:40.126 "data_size": 63488 00:15:40.126 } 00:15:40.126 ] 00:15:40.126 }' 00:15:40.126 07:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.126 07:21:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:40.693 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:41.268 [2024-07-25 07:21:13.642688] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:41.268 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:41.268 "name": "raid_bdev1", 00:15:41.268 "aliases": [ 00:15:41.268 "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244" 00:15:41.268 ], 00:15:41.268 "product_name": "Raid Volume", 00:15:41.269 "block_size": 512, 00:15:41.269 "num_blocks": 190464, 00:15:41.269 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:41.269 "assigned_rate_limits": { 00:15:41.269 "rw_ios_per_sec": 0, 00:15:41.269 "rw_mbytes_per_sec": 0, 00:15:41.269 "r_mbytes_per_sec": 0, 00:15:41.269 "w_mbytes_per_sec": 0 00:15:41.269 }, 00:15:41.269 "claimed": false, 00:15:41.269 "zoned": false, 00:15:41.269 "supported_io_types": { 00:15:41.269 "read": true, 00:15:41.269 "write": true, 00:15:41.269 "unmap": true, 00:15:41.269 "flush": true, 00:15:41.269 "reset": true, 00:15:41.269 "nvme_admin": false, 00:15:41.269 "nvme_io": false, 00:15:41.269 "nvme_io_md": false, 00:15:41.269 "write_zeroes": true, 00:15:41.269 "zcopy": false, 00:15:41.269 "get_zone_info": false, 00:15:41.269 "zone_management": false, 00:15:41.269 "zone_append": false, 00:15:41.269 "compare": false, 00:15:41.269 "compare_and_write": false, 00:15:41.269 "abort": false, 00:15:41.269 "seek_hole": false, 00:15:41.269 "seek_data": false, 00:15:41.269 "copy": false, 00:15:41.269 "nvme_iov_md": false 00:15:41.269 }, 00:15:41.269 "memory_domains": [ 00:15:41.269 { 00:15:41.269 "dma_device_id": "system", 00:15:41.269 "dma_device_type": 1 00:15:41.269 }, 
00:15:41.269 { 00:15:41.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.269 "dma_device_type": 2 00:15:41.269 }, 00:15:41.269 { 00:15:41.269 "dma_device_id": "system", 00:15:41.269 "dma_device_type": 1 00:15:41.269 }, 00:15:41.269 { 00:15:41.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.269 "dma_device_type": 2 00:15:41.269 }, 00:15:41.269 { 00:15:41.269 "dma_device_id": "system", 00:15:41.269 "dma_device_type": 1 00:15:41.269 }, 00:15:41.269 { 00:15:41.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.269 "dma_device_type": 2 00:15:41.269 } 00:15:41.269 ], 00:15:41.269 "driver_specific": { 00:15:41.269 "raid": { 00:15:41.269 "uuid": "cb73cdcd-d3e7-4511-8fd9-c26bd8cde244", 00:15:41.269 "strip_size_kb": 64, 00:15:41.269 "state": "online", 00:15:41.269 "raid_level": "raid0", 00:15:41.269 "superblock": true, 00:15:41.269 "num_base_bdevs": 3, 00:15:41.269 "num_base_bdevs_discovered": 3, 00:15:41.269 "num_base_bdevs_operational": 3, 00:15:41.269 "base_bdevs_list": [ 00:15:41.269 { 00:15:41.269 "name": "pt1", 00:15:41.269 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:41.269 "is_configured": true, 00:15:41.269 "data_offset": 2048, 00:15:41.269 "data_size": 63488 00:15:41.269 }, 00:15:41.269 { 00:15:41.269 "name": "pt2", 00:15:41.269 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:41.269 "is_configured": true, 00:15:41.269 "data_offset": 2048, 00:15:41.269 "data_size": 63488 00:15:41.269 }, 00:15:41.269 { 00:15:41.269 "name": "pt3", 00:15:41.269 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:41.269 "is_configured": true, 00:15:41.269 "data_offset": 2048, 00:15:41.269 "data_size": 63488 00:15:41.269 } 00:15:41.269 ] 00:15:41.269 } 00:15:41.269 } 00:15:41.269 }' 00:15:41.269 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:41.269 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:41.269 pt2 00:15:41.269 pt3' 00:15:41.269 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.269 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:41.269 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:41.527 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:41.527 "name": "pt1", 00:15:41.527 "aliases": [ 00:15:41.527 "00000000-0000-0000-0000-000000000001" 00:15:41.527 ], 00:15:41.527 "product_name": "passthru", 00:15:41.527 "block_size": 512, 00:15:41.527 "num_blocks": 65536, 00:15:41.527 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:41.527 "assigned_rate_limits": { 00:15:41.527 "rw_ios_per_sec": 0, 00:15:41.527 "rw_mbytes_per_sec": 0, 00:15:41.527 "r_mbytes_per_sec": 0, 00:15:41.527 "w_mbytes_per_sec": 0 00:15:41.527 }, 00:15:41.527 "claimed": true, 00:15:41.527 "claim_type": "exclusive_write", 00:15:41.527 "zoned": false, 00:15:41.527 "supported_io_types": { 00:15:41.527 "read": true, 00:15:41.527 "write": true, 00:15:41.527 "unmap": true, 00:15:41.527 "flush": true, 00:15:41.527 "reset": true, 00:15:41.527 "nvme_admin": false, 00:15:41.527 "nvme_io": false, 00:15:41.527 "nvme_io_md": false, 00:15:41.527 "write_zeroes": true, 00:15:41.527 "zcopy": true, 00:15:41.527 "get_zone_info": false, 00:15:41.527 "zone_management": false, 00:15:41.527 
"zone_append": false, 00:15:41.527 "compare": false, 00:15:41.527 "compare_and_write": false, 00:15:41.527 "abort": true, 00:15:41.527 "seek_hole": false, 00:15:41.527 "seek_data": false, 00:15:41.527 "copy": true, 00:15:41.527 "nvme_iov_md": false 00:15:41.527 }, 00:15:41.527 "memory_domains": [ 00:15:41.527 { 00:15:41.527 "dma_device_id": "system", 00:15:41.527 "dma_device_type": 1 00:15:41.527 }, 00:15:41.527 { 00:15:41.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.527 "dma_device_type": 2 00:15:41.527 } 00:15:41.527 ], 00:15:41.527 "driver_specific": { 00:15:41.527 "passthru": { 00:15:41.527 "name": "pt1", 00:15:41.527 "base_bdev_name": "malloc1" 00:15:41.527 } 00:15:41.527 } 00:15:41.527 }' 00:15:41.527 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.527 07:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.527 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.527 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:41.786 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.046 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.046 "name": "pt2", 00:15:42.046 "aliases": [ 00:15:42.046 "00000000-0000-0000-0000-000000000002" 00:15:42.046 ], 00:15:42.046 "product_name": "passthru", 00:15:42.046 "block_size": 512, 00:15:42.046 "num_blocks": 65536, 00:15:42.046 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:42.046 "assigned_rate_limits": { 00:15:42.046 "rw_ios_per_sec": 0, 00:15:42.046 "rw_mbytes_per_sec": 0, 00:15:42.046 "r_mbytes_per_sec": 0, 00:15:42.046 "w_mbytes_per_sec": 0 00:15:42.046 }, 00:15:42.046 "claimed": true, 00:15:42.046 "claim_type": "exclusive_write", 00:15:42.046 "zoned": false, 00:15:42.046 "supported_io_types": { 00:15:42.046 "read": true, 00:15:42.046 "write": true, 00:15:42.046 "unmap": true, 00:15:42.046 "flush": true, 00:15:42.046 "reset": true, 00:15:42.046 "nvme_admin": false, 00:15:42.046 "nvme_io": false, 00:15:42.046 "nvme_io_md": false, 00:15:42.046 "write_zeroes": true, 00:15:42.046 "zcopy": true, 00:15:42.046 "get_zone_info": false, 00:15:42.046 "zone_management": false, 00:15:42.046 "zone_append": false, 00:15:42.046 "compare": false, 00:15:42.046 "compare_and_write": false, 00:15:42.046 "abort": true, 00:15:42.046 
"seek_hole": false, 00:15:42.046 "seek_data": false, 00:15:42.046 "copy": true, 00:15:42.046 "nvme_iov_md": false 00:15:42.046 }, 00:15:42.046 "memory_domains": [ 00:15:42.046 { 00:15:42.046 "dma_device_id": "system", 00:15:42.046 "dma_device_type": 1 00:15:42.046 }, 00:15:42.046 { 00:15:42.046 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.046 "dma_device_type": 2 00:15:42.046 } 00:15:42.046 ], 00:15:42.046 "driver_specific": { 00:15:42.046 "passthru": { 00:15:42.046 "name": "pt2", 00:15:42.046 "base_bdev_name": "malloc2" 00:15:42.046 } 00:15:42.046 } 00:15:42.046 }' 00:15:42.046 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.046 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.305 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.564 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.564 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.564 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:42.564 07:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.564 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.564 "name": "pt3", 00:15:42.564 "aliases": [ 00:15:42.564 "00000000-0000-0000-0000-000000000003" 00:15:42.564 ], 00:15:42.564 "product_name": "passthru", 00:15:42.564 "block_size": 512, 00:15:42.564 "num_blocks": 65536, 00:15:42.564 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:42.564 "assigned_rate_limits": { 00:15:42.564 "rw_ios_per_sec": 0, 00:15:42.564 "rw_mbytes_per_sec": 0, 00:15:42.564 "r_mbytes_per_sec": 0, 00:15:42.564 "w_mbytes_per_sec": 0 00:15:42.564 }, 00:15:42.564 "claimed": true, 00:15:42.564 "claim_type": "exclusive_write", 00:15:42.564 "zoned": false, 00:15:42.564 "supported_io_types": { 00:15:42.564 "read": true, 00:15:42.564 "write": true, 00:15:42.564 "unmap": true, 00:15:42.564 "flush": true, 00:15:42.564 "reset": true, 00:15:42.564 "nvme_admin": false, 00:15:42.564 "nvme_io": false, 00:15:42.564 "nvme_io_md": false, 00:15:42.564 "write_zeroes": true, 00:15:42.564 "zcopy": true, 00:15:42.564 "get_zone_info": false, 00:15:42.564 "zone_management": false, 00:15:42.564 "zone_append": false, 00:15:42.564 "compare": false, 00:15:42.564 "compare_and_write": false, 00:15:42.564 "abort": true, 00:15:42.564 "seek_hole": false, 00:15:42.564 "seek_data": false, 00:15:42.564 "copy": true, 00:15:42.564 "nvme_iov_md": false 00:15:42.564 }, 
00:15:42.564 "memory_domains": [ 00:15:42.564 { 00:15:42.564 "dma_device_id": "system", 00:15:42.564 "dma_device_type": 1 00:15:42.564 }, 00:15:42.564 { 00:15:42.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.564 "dma_device_type": 2 00:15:42.564 } 00:15:42.564 ], 00:15:42.564 "driver_specific": { 00:15:42.564 "passthru": { 00:15:42.564 "name": "pt3", 00:15:42.564 "base_bdev_name": "malloc3" 00:15:42.564 } 00:15:42.564 } 00:15:42.564 }' 00:15:42.564 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.822 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.081 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.081 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.081 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.081 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:43.081 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:15:43.340 [2024-07-25 07:21:15.655986] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' cb73cdcd-d3e7-4511-8fd9-c26bd8cde244 '!=' cb73cdcd-d3e7-4511-8fd9-c26bd8cde244 ']' 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1622360 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1622360 ']' 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1622360 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1622360 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1622360' 00:15:43.340 
killing process with pid 1622360 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1622360 00:15:43.340 [2024-07-25 07:21:15.733592] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:43.340 [2024-07-25 07:21:15.733640] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:43.340 [2024-07-25 07:21:15.733690] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:43.340 [2024-07-25 07:21:15.733701] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf1bef0 name raid_bdev1, state offline 00:15:43.340 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1622360 00:15:43.340 [2024-07-25 07:21:15.757051] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:43.603 07:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:15:43.603 00:15:43.603 real 0m13.740s 00:15:43.603 user 0m24.759s 00:15:43.603 sys 0m2.481s 00:15:43.603 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:43.603 07:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.603 ************************************ 00:15:43.603 END TEST raid_superblock_test 00:15:43.603 ************************************ 00:15:43.603 07:21:15 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:15:43.603 07:21:15 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:43.603 07:21:15 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:43.603 07:21:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:43.603 ************************************ 00:15:43.603 START TEST raid_read_error_test 00:15:43.603 ************************************ 00:15:43.603 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 read 00:15:43.603 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:15:43.603 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:15:43.603 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:15:43.603 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:43.603 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.dKhckhglRM 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1625327 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1625327 /var/tmp/spdk-raid.sock 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1625327 ']' 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:43.604 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:43.604 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.604 [2024-07-25 07:21:16.083517] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
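For reference, the verify_raid_bdev_state calls traced above all reduce to one pattern: query every raid bdev over the test's RPC socket, pick out raid_bdev1 with jq, and compare the reported fields against the expected values ("configuring" while base bdevs are still being attached, "online" once all three are claimed). A minimal standalone sketch of that pattern, assuming only the rpc.py path and socket shown in this log and simplifying the comparison to the state field alone:

    # Minimal sketch of the verification step traced above; the real helper in
    # bdev_raid.sh also checks raid_level, strip_size and the number of
    # discovered/operational base bdevs, not just the state.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    info=$($rpc -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    state=$(echo "$info" | jq -r '.state')
    [ "$state" = online ] || exit 1   # expected value depends on the test step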
00:15:43.604 [2024-07-25 07:21:16.083576] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625327 ] 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:43.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.863 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:43.864 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:43.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:43.864 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:43.864 [2024-07-25 07:21:16.218455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:43.864 [2024-07-25 07:21:16.305058] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:15:43.864 [2024-07-25 07:21:16.372968] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:43.864 [2024-07-25 07:21:16.373006] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:44.802 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:44.802 07:21:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:44.802 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:44.802 07:21:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:44.802 BaseBdev1_malloc 00:15:44.802 07:21:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:45.062 true 00:15:45.062 07:21:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:45.321 [2024-07-25 07:21:17.662160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:45.321 [2024-07-25 07:21:17.662202] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.321 [2024-07-25 07:21:17.662220] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1872a50 00:15:45.321 [2024-07-25 07:21:17.662231] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.321 [2024-07-25 07:21:17.663628] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.321 [2024-07-25 07:21:17.663656] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:45.321 BaseBdev1 00:15:45.321 07:21:17 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:45.321 07:21:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:45.580 BaseBdev2_malloc 00:15:45.580 07:21:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:45.839 true 00:15:45.839 07:21:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:45.839 [2024-07-25 07:21:18.352287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:45.839 [2024-07-25 07:21:18.352326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.839 [2024-07-25 07:21:18.352343] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a1bf40 00:15:45.840 [2024-07-25 07:21:18.352355] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.840 [2024-07-25 07:21:18.353616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.840 [2024-07-25 07:21:18.353641] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:45.840 BaseBdev2 00:15:45.840 07:21:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:45.840 07:21:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:46.099 BaseBdev3_malloc 00:15:46.099 07:21:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:46.358 true 00:15:46.358 07:21:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:46.618 [2024-07-25 07:21:19.026092] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:46.618 [2024-07-25 07:21:19.026126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:46.618 [2024-07-25 07:21:19.026147] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a1f250 00:15:46.618 [2024-07-25 07:21:19.026158] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:46.618 [2024-07-25 07:21:19.027417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:46.618 [2024-07-25 07:21:19.027442] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:46.618 BaseBdev3 00:15:46.618 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:46.900 [2024-07-25 07:21:19.246713] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.900 [2024-07-25 07:21:19.247848] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:46.900 [2024-07-25 07:21:19.247911] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:46.900 [2024-07-25 07:21:19.248104] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a20010 00:15:46.900 [2024-07-25 07:21:19.248114] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:46.900 [2024-07-25 07:21:19.248290] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x186ec70 00:15:46.900 [2024-07-25 07:21:19.248423] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a20010 00:15:46.900 [2024-07-25 07:21:19.248433] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a20010 00:15:46.900 [2024-07-25 07:21:19.248524] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.900 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:47.171 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.171 "name": "raid_bdev1", 00:15:47.171 "uuid": "b4fd1029-d440-435a-84c9-d93434b2c56a", 00:15:47.171 "strip_size_kb": 64, 00:15:47.171 "state": "online", 00:15:47.171 "raid_level": "raid0", 00:15:47.171 "superblock": true, 00:15:47.171 "num_base_bdevs": 3, 00:15:47.171 "num_base_bdevs_discovered": 3, 00:15:47.171 "num_base_bdevs_operational": 3, 00:15:47.171 "base_bdevs_list": [ 00:15:47.171 { 00:15:47.171 "name": "BaseBdev1", 00:15:47.171 "uuid": "7b285def-025d-55dc-8dcd-3c83be655487", 00:15:47.171 "is_configured": true, 00:15:47.171 "data_offset": 2048, 00:15:47.171 "data_size": 63488 00:15:47.171 }, 00:15:47.171 { 00:15:47.171 "name": "BaseBdev2", 00:15:47.171 "uuid": "3f369583-763f-57da-be14-677c89ab4267", 00:15:47.171 "is_configured": true, 00:15:47.171 "data_offset": 2048, 00:15:47.171 "data_size": 63488 00:15:47.171 }, 00:15:47.171 { 00:15:47.171 "name": "BaseBdev3", 00:15:47.171 "uuid": "cef1062b-603e-50f2-b0a3-2a5e15950bcb", 00:15:47.171 "is_configured": true, 00:15:47.171 "data_offset": 2048, 00:15:47.171 "data_size": 63488 
00:15:47.171 } 00:15:47.171 ] 00:15:47.171 }' 00:15:47.171 07:21:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.171 07:21:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.739 07:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:47.739 07:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:47.739 [2024-07-25 07:21:20.185437] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x186e430 00:15:48.675 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.934 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:49.193 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.193 "name": "raid_bdev1", 00:15:49.193 "uuid": "b4fd1029-d440-435a-84c9-d93434b2c56a", 00:15:49.193 "strip_size_kb": 64, 00:15:49.193 "state": "online", 00:15:49.193 "raid_level": "raid0", 00:15:49.193 "superblock": true, 00:15:49.193 "num_base_bdevs": 3, 00:15:49.193 "num_base_bdevs_discovered": 3, 00:15:49.193 "num_base_bdevs_operational": 3, 00:15:49.193 "base_bdevs_list": [ 00:15:49.193 { 00:15:49.193 "name": "BaseBdev1", 00:15:49.193 "uuid": "7b285def-025d-55dc-8dcd-3c83be655487", 00:15:49.193 "is_configured": true, 00:15:49.193 "data_offset": 2048, 00:15:49.193 "data_size": 63488 00:15:49.193 }, 00:15:49.193 { 00:15:49.193 "name": "BaseBdev2", 00:15:49.193 "uuid": "3f369583-763f-57da-be14-677c89ab4267", 00:15:49.193 "is_configured": true, 00:15:49.193 "data_offset": 2048, 
00:15:49.193 "data_size": 63488 00:15:49.193 }, 00:15:49.193 { 00:15:49.193 "name": "BaseBdev3", 00:15:49.193 "uuid": "cef1062b-603e-50f2-b0a3-2a5e15950bcb", 00:15:49.193 "is_configured": true, 00:15:49.193 "data_offset": 2048, 00:15:49.193 "data_size": 63488 00:15:49.193 } 00:15:49.193 ] 00:15:49.193 }' 00:15:49.193 07:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.193 07:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.761 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:50.020 [2024-07-25 07:21:22.351552] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:50.020 [2024-07-25 07:21:22.351585] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:50.020 [2024-07-25 07:21:22.354509] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:50.020 [2024-07-25 07:21:22.354542] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:50.020 [2024-07-25 07:21:22.354573] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:50.020 [2024-07-25 07:21:22.354583] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a20010 name raid_bdev1, state offline 00:15:50.020 0 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1625327 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1625327 ']' 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1625327 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1625327 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1625327' 00:15:50.020 killing process with pid 1625327 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1625327 00:15:50.020 [2024-07-25 07:21:22.416835] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:50.020 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1625327 00:15:50.020 [2024-07-25 07:21:22.434607] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.dKhckhglRM 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@213 -- # case $1 in 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:15:50.279 00:15:50.279 real 0m6.617s 00:15:50.279 user 0m10.438s 00:15:50.279 sys 0m1.161s 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:50.279 07:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.279 ************************************ 00:15:50.279 END TEST raid_read_error_test 00:15:50.279 ************************************ 00:15:50.279 07:21:22 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:15:50.279 07:21:22 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:50.279 07:21:22 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:50.279 07:21:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:50.279 ************************************ 00:15:50.279 START TEST raid_write_error_test 00:15:50.279 ************************************ 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 write 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:50.280 07:21:22 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.Qo11fwmko4 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1626696 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1626696 /var/tmp/spdk-raid.sock 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1626696 ']' 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:50.280 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:50.280 07:21:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.280 [2024-07-25 07:21:22.788209] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
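The raid_write_error_test starting here builds the same kind of stack the read-error test above just exercised. Pulling the individual rpc.py calls out of that earlier trace, the setup amounts to the sketch below. This is a reconstruction, not the literal bdev_raid.sh source: the redirect of bdevperf output into the mktemp'd log file and the loop over the three base bdevs are assumptions inferred from how the log file is grepped later and how the per-bdev calls repeat in the trace.

    # Sketch of the setup the error tests drive over /var/tmp/spdk-raid.sock:
    # malloc -> error (EE_*) -> passthru (BaseBdevN) -> raid0 named raid_bdev1.
    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
    sock=/var/tmp/spdk-raid.sock
    bdevperf_log=$(mktemp -p /raidtest)
    # Launch bdevperf as the RPC server; flags copied from the trace. Sending its
    # output into $bdevperf_log is an assumption based on the later grep over it.
    "$spdk/build/examples/bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" &
    raid_pid=$!
    waitforlisten "$raid_pid" "$sock"   # helper sourced from autotest_common.sh
    for i in 1 2 3; do
        "$spdk/scripts/rpc.py" -s "$sock" bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
        "$spdk/scripts/rpc.py" -s "$sock" bdev_error_create "BaseBdev${i}_malloc"   # exposes EE_BaseBdev${i}_malloc
        "$spdk/scripts/rpc.py" -s "$sock" bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done
    "$spdk/scripts/rpc.py" -s "$sock" bdev_raid_create -z 64 -r raid0 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s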
00:15:50.280 [2024-07-25 07:21:22.788271] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626696 ] 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:50.539 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.539 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:50.540 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:50.540 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.540 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:50.540 [2024-07-25 07:21:22.921201] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:50.540 [2024-07-25 07:21:23.007748] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:15:50.540 [2024-07-25 07:21:23.066481] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:50.540 [2024-07-25 07:21:23.066517] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:51.474 07:21:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:51.474 07:21:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:51.474 07:21:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:51.474 07:21:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:51.474 BaseBdev1_malloc 00:15:51.474 07:21:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:51.733 true 00:15:51.733 07:21:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:51.992 [2024-07-25 07:21:24.364548] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:51.992 [2024-07-25 07:21:24.364587] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.992 [2024-07-25 07:21:24.364605] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a9a50 00:15:51.992 [2024-07-25 07:21:24.364616] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.992 [2024-07-25 07:21:24.366087] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.992 [2024-07-25 07:21:24.366116] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:51.992 BaseBdev1 00:15:51.992 07:21:24 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:51.992 07:21:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:52.251 BaseBdev2_malloc 00:15:52.251 07:21:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:52.509 true 00:15:52.509 07:21:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:52.509 [2024-07-25 07:21:25.034716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:52.509 [2024-07-25 07:21:25.034756] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:52.509 [2024-07-25 07:21:25.034774] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a52f40 00:15:52.509 [2024-07-25 07:21:25.034786] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:52.509 [2024-07-25 07:21:25.036172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:52.509 [2024-07-25 07:21:25.036203] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:52.509 BaseBdev2 00:15:52.768 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:52.768 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:52.768 BaseBdev3_malloc 00:15:52.768 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:53.027 true 00:15:53.027 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:53.286 [2024-07-25 07:21:25.716803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:53.286 [2024-07-25 07:21:25.716843] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.286 [2024-07-25 07:21:25.716860] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a56250 00:15:53.286 [2024-07-25 07:21:25.716872] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.286 [2024-07-25 07:21:25.718241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:53.286 [2024-07-25 07:21:25.718269] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:53.286 BaseBdev3 00:15:53.286 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:53.545 [2024-07-25 07:21:25.933401] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:53.545 [2024-07-25 07:21:25.934531] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:53.545 [2024-07-25 07:21:25.934595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:53.545 [2024-07-25 07:21:25.934784] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a57010 00:15:53.545 [2024-07-25 07:21:25.934795] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:53.545 [2024-07-25 07:21:25.934963] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a5c70 00:15:53.545 [2024-07-25 07:21:25.935096] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a57010 00:15:53.545 [2024-07-25 07:21:25.935105] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a57010 00:15:53.545 [2024-07-25 07:21:25.935210] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.545 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.546 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.546 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:53.546 07:21:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.804 07:21:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:53.804 "name": "raid_bdev1", 00:15:53.805 "uuid": "b8d7d72c-2f2a-487a-a218-21050b6ef487", 00:15:53.805 "strip_size_kb": 64, 00:15:53.805 "state": "online", 00:15:53.805 "raid_level": "raid0", 00:15:53.805 "superblock": true, 00:15:53.805 "num_base_bdevs": 3, 00:15:53.805 "num_base_bdevs_discovered": 3, 00:15:53.805 "num_base_bdevs_operational": 3, 00:15:53.805 "base_bdevs_list": [ 00:15:53.805 { 00:15:53.805 "name": "BaseBdev1", 00:15:53.805 "uuid": "94a0a1e1-3c2b-53b6-8100-81ab42c5b50f", 00:15:53.805 "is_configured": true, 00:15:53.805 "data_offset": 2048, 00:15:53.805 "data_size": 63488 00:15:53.805 }, 00:15:53.805 { 00:15:53.805 "name": "BaseBdev2", 00:15:53.805 "uuid": "206d37b5-33f4-56fa-907b-2601118fa632", 00:15:53.805 "is_configured": true, 00:15:53.805 "data_offset": 2048, 00:15:53.805 "data_size": 63488 00:15:53.805 }, 00:15:53.805 { 00:15:53.805 "name": "BaseBdev3", 00:15:53.805 "uuid": "ee549142-593d-5952-809a-30cbff383709", 00:15:53.805 "is_configured": true, 00:15:53.805 "data_offset": 2048, 00:15:53.805 
"data_size": 63488 00:15:53.805 } 00:15:53.805 ] 00:15:53.805 }' 00:15:53.805 07:21:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.805 07:21:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.372 07:21:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:54.372 07:21:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:54.372 [2024-07-25 07:21:26.844032] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a5430 00:15:55.310 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.569 07:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:55.829 07:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.829 "name": "raid_bdev1", 00:15:55.829 "uuid": "b8d7d72c-2f2a-487a-a218-21050b6ef487", 00:15:55.829 "strip_size_kb": 64, 00:15:55.829 "state": "online", 00:15:55.829 "raid_level": "raid0", 00:15:55.829 "superblock": true, 00:15:55.829 "num_base_bdevs": 3, 00:15:55.829 "num_base_bdevs_discovered": 3, 00:15:55.829 "num_base_bdevs_operational": 3, 00:15:55.829 "base_bdevs_list": [ 00:15:55.829 { 00:15:55.829 "name": "BaseBdev1", 00:15:55.829 "uuid": "94a0a1e1-3c2b-53b6-8100-81ab42c5b50f", 00:15:55.829 "is_configured": true, 00:15:55.829 "data_offset": 2048, 00:15:55.829 "data_size": 63488 00:15:55.829 }, 00:15:55.829 { 00:15:55.829 "name": "BaseBdev2", 00:15:55.829 "uuid": "206d37b5-33f4-56fa-907b-2601118fa632", 00:15:55.829 "is_configured": true, 
00:15:55.829 "data_offset": 2048, 00:15:55.829 "data_size": 63488 00:15:55.829 }, 00:15:55.829 { 00:15:55.829 "name": "BaseBdev3", 00:15:55.829 "uuid": "ee549142-593d-5952-809a-30cbff383709", 00:15:55.829 "is_configured": true, 00:15:55.829 "data_offset": 2048, 00:15:55.829 "data_size": 63488 00:15:55.829 } 00:15:55.829 ] 00:15:55.829 }' 00:15:55.829 07:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.829 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:56.398 [2024-07-25 07:21:28.841134] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:56.398 [2024-07-25 07:21:28.841180] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:56.398 [2024-07-25 07:21:28.844061] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:56.398 [2024-07-25 07:21:28.844094] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:56.398 [2024-07-25 07:21:28.844124] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:56.398 [2024-07-25 07:21:28.844135] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a57010 name raid_bdev1, state offline 00:15:56.398 0 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1626696 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1626696 ']' 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1626696 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1626696 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1626696' 00:15:56.398 killing process with pid 1626696 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1626696 00:15:56.398 [2024-07-25 07:21:28.918988] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:56.398 07:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1626696 00:15:56.657 [2024-07-25 07:21:28.937025] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:56.657 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.Qo11fwmko4 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 
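(With a write failure armed on EE_BaseBdev1_malloc via bdev_error_inject_error, the verdict is read back out of the bdevperf log: the grep/awk pipeline above extracts the failure rate reported for raid_bdev1 (0.50 in this run), and the has_redundancy check that follows decides which way the assertion goes. A minimal sketch of that final check, reduced from the bdev/bdev_raid.sh trace; the redundant-level branch is inferred and not exercised in this run:

  fail_per_s=$(grep -v Job /raidtest/tmp.Qo11fwmko4 | grep raid_bdev1 | awk '{print $6}')
  if has_redundancy raid0; then          # helper from bdev/bdev_raid.sh (case $1 in ...)
      # raid1-style levels are expected to absorb the injected errors (assumed branch)
      [[ $fail_per_s == 0.00 ]]
  else
      # raid0 has no redundancy, so the injected write failures must surface as a non-zero rate
      [[ $fail_per_s != 0.00 ]]
  fi
)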
00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:15:56.658 00:15:56.658 real 0m6.427s 00:15:56.658 user 0m10.075s 00:15:56.658 sys 0m1.102s 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:56.658 07:21:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.658 ************************************ 00:15:56.658 END TEST raid_write_error_test 00:15:56.658 ************************************ 00:15:56.658 07:21:29 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:15:56.658 07:21:29 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:15:56.658 07:21:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:56.658 07:21:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:56.658 07:21:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:56.918 ************************************ 00:15:56.918 START TEST raid_state_function_test 00:15:56.918 ************************************ 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 false 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 
00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1627884 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1627884' 00:15:56.918 Process raid pid: 1627884 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1627884 /var/tmp/spdk-raid.sock 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1627884 ']' 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:56.918 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:56.918 07:21:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.918 [2024-07-25 07:21:29.286688] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
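(Unlike the two error tests, raid_state_function_test runs no I/O: it starts the lighter bdev_svc app and drives the raid state machine purely over the RPC socket, creating Existed_Raid (concat, 64 KB strip, 3 base bdevs, no superblock) before any base bdev exists, expecting the volume to sit in "configuring", then adding BaseBdev1..3 and re-checking the reported state until it goes "online". A minimal sketch of the check that verify_raid_bdev_state performs in the trace below; the per-field jq extraction is an assumed reduction, while the field names come straight from the bdev_raid_get_bdevs output:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  # pull the Existed_Raid entry out of the full bdev_raid_get_bdevs dump
  info=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  [[ $(jq -r .state      <<< "$info") == configuring ]]
  [[ $(jq -r .raid_level <<< "$info") == concat ]]
  [[ $(jq -r .num_base_bdevs_discovered <<< "$info") -eq 0 ]]   # climbs to 3 as BaseBdev1..3 are registered
)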
00:15:56.918 [2024-07-25 07:21:29.286746] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:56.918 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:56.918 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.918 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:56.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.919 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:56.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.919 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:56.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.919 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:56.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.919 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:56.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.919 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:56.919 [2024-07-25 07:21:29.416535] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:57.178 [2024-07-25 07:21:29.502830] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:15:57.178 [2024-07-25 07:21:29.563006] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:57.178 [2024-07-25 07:21:29.563040] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:57.746 07:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:57.746 07:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:15:57.746 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:58.006 [2024-07-25 07:21:30.397115] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:58.006 [2024-07-25 07:21:30.397157] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:58.006 [2024-07-25 07:21:30.397168] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:58.006 [2024-07-25 07:21:30.397178] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:58.006 [2024-07-25 07:21:30.397186] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:58.006 [2024-07-25 07:21:30.397196] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.006 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.265 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.265 "name": "Existed_Raid", 00:15:58.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.265 "strip_size_kb": 64, 00:15:58.265 "state": "configuring", 00:15:58.265 "raid_level": "concat", 00:15:58.265 "superblock": false, 00:15:58.265 "num_base_bdevs": 3, 00:15:58.265 "num_base_bdevs_discovered": 0, 00:15:58.265 "num_base_bdevs_operational": 3, 00:15:58.265 "base_bdevs_list": [ 00:15:58.265 { 00:15:58.265 "name": "BaseBdev1", 00:15:58.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.265 "is_configured": false, 00:15:58.265 "data_offset": 0, 00:15:58.265 "data_size": 0 00:15:58.265 }, 00:15:58.265 { 00:15:58.265 "name": "BaseBdev2", 00:15:58.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.265 "is_configured": false, 00:15:58.265 "data_offset": 0, 00:15:58.265 "data_size": 0 00:15:58.265 }, 00:15:58.265 { 00:15:58.265 "name": "BaseBdev3", 00:15:58.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.265 "is_configured": false, 00:15:58.265 "data_offset": 0, 00:15:58.265 "data_size": 0 00:15:58.265 } 00:15:58.265 ] 00:15:58.265 }' 00:15:58.265 07:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.265 07:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:58.833 07:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:58.833 [2024-07-25 07:21:31.283361] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:58.833 [2024-07-25 07:21:31.283384] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b1ec0 name Existed_Raid, state configuring 00:15:58.833 07:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:59.094 [2024-07-25 07:21:31.451814] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:59.094 [2024-07-25 07:21:31.451839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:59.094 [2024-07-25 07:21:31.451849] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:59.094 [2024-07-25 07:21:31.451859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:15:59.094 [2024-07-25 07:21:31.451867] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:59.094 [2024-07-25 07:21:31.451877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:59.094 07:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:59.353 [2024-07-25 07:21:31.633773] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:59.353 BaseBdev1 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.353 07:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:59.611 [ 00:15:59.611 { 00:15:59.611 "name": "BaseBdev1", 00:15:59.611 "aliases": [ 00:15:59.611 "b7aadf51-1459-4066-97b8-34903703f0e6" 00:15:59.611 ], 00:15:59.611 "product_name": "Malloc disk", 00:15:59.611 "block_size": 512, 00:15:59.611 "num_blocks": 65536, 00:15:59.611 "uuid": "b7aadf51-1459-4066-97b8-34903703f0e6", 00:15:59.611 "assigned_rate_limits": { 00:15:59.611 "rw_ios_per_sec": 0, 00:15:59.611 "rw_mbytes_per_sec": 0, 00:15:59.611 "r_mbytes_per_sec": 0, 00:15:59.611 "w_mbytes_per_sec": 0 00:15:59.611 }, 00:15:59.611 "claimed": true, 00:15:59.611 "claim_type": "exclusive_write", 00:15:59.611 "zoned": false, 00:15:59.611 "supported_io_types": { 00:15:59.611 "read": true, 00:15:59.611 "write": true, 00:15:59.611 "unmap": true, 00:15:59.611 "flush": true, 00:15:59.611 "reset": true, 00:15:59.611 "nvme_admin": false, 00:15:59.611 "nvme_io": false, 00:15:59.611 "nvme_io_md": false, 00:15:59.611 "write_zeroes": true, 00:15:59.611 "zcopy": true, 00:15:59.611 "get_zone_info": false, 00:15:59.611 "zone_management": false, 00:15:59.611 "zone_append": false, 00:15:59.611 "compare": false, 00:15:59.611 "compare_and_write": false, 00:15:59.611 "abort": true, 00:15:59.611 "seek_hole": false, 00:15:59.611 "seek_data": false, 00:15:59.611 "copy": true, 00:15:59.611 "nvme_iov_md": false 00:15:59.612 }, 00:15:59.612 "memory_domains": [ 00:15:59.612 { 00:15:59.612 "dma_device_id": "system", 00:15:59.612 "dma_device_type": 1 00:15:59.612 }, 00:15:59.612 { 00:15:59.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.612 "dma_device_type": 2 00:15:59.612 } 00:15:59.612 ], 00:15:59.612 "driver_specific": {} 00:15:59.612 } 00:15:59.612 ] 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:59.612 07:21:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.612 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.871 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.871 "name": "Existed_Raid", 00:15:59.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.871 "strip_size_kb": 64, 00:15:59.871 "state": "configuring", 00:15:59.871 "raid_level": "concat", 00:15:59.871 "superblock": false, 00:15:59.871 "num_base_bdevs": 3, 00:15:59.871 "num_base_bdevs_discovered": 1, 00:15:59.871 "num_base_bdevs_operational": 3, 00:15:59.871 "base_bdevs_list": [ 00:15:59.871 { 00:15:59.871 "name": "BaseBdev1", 00:15:59.871 "uuid": "b7aadf51-1459-4066-97b8-34903703f0e6", 00:15:59.871 "is_configured": true, 00:15:59.871 "data_offset": 0, 00:15:59.871 "data_size": 65536 00:15:59.871 }, 00:15:59.871 { 00:15:59.871 "name": "BaseBdev2", 00:15:59.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.871 "is_configured": false, 00:15:59.871 "data_offset": 0, 00:15:59.871 "data_size": 0 00:15:59.871 }, 00:15:59.871 { 00:15:59.871 "name": "BaseBdev3", 00:15:59.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.871 "is_configured": false, 00:15:59.871 "data_offset": 0, 00:15:59.871 "data_size": 0 00:15:59.871 } 00:15:59.871 ] 00:15:59.871 }' 00:15:59.871 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.871 07:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.465 07:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:00.725 [2024-07-25 07:21:32.989343] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:00.725 [2024-07-25 07:21:32.989380] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b1790 name Existed_Raid, state configuring 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 
-r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:00.725 [2024-07-25 07:21:33.221988] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:00.725 [2024-07-25 07:21:33.223403] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:00.725 [2024-07-25 07:21:33.223437] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:00.725 [2024-07-25 07:21:33.223447] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:00.725 [2024-07-25 07:21:33.223458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.725 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.990 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.990 "name": "Existed_Raid", 00:16:00.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.990 "strip_size_kb": 64, 00:16:00.990 "state": "configuring", 00:16:00.990 "raid_level": "concat", 00:16:00.990 "superblock": false, 00:16:00.990 "num_base_bdevs": 3, 00:16:00.990 "num_base_bdevs_discovered": 1, 00:16:00.990 "num_base_bdevs_operational": 3, 00:16:00.990 "base_bdevs_list": [ 00:16:00.990 { 00:16:00.990 "name": "BaseBdev1", 00:16:00.990 "uuid": "b7aadf51-1459-4066-97b8-34903703f0e6", 00:16:00.990 "is_configured": true, 00:16:00.990 "data_offset": 0, 00:16:00.990 "data_size": 65536 00:16:00.990 }, 00:16:00.990 { 00:16:00.990 "name": "BaseBdev2", 00:16:00.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.990 "is_configured": false, 00:16:00.990 "data_offset": 0, 00:16:00.990 "data_size": 0 00:16:00.990 }, 00:16:00.990 { 00:16:00.990 "name": "BaseBdev3", 00:16:00.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.990 "is_configured": false, 00:16:00.990 "data_offset": 0, 
00:16:00.990 "data_size": 0 00:16:00.990 } 00:16:00.990 ] 00:16:00.990 }' 00:16:00.990 07:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.990 07:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.562 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:01.822 [2024-07-25 07:21:34.255755] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:01.822 BaseBdev2 00:16:01.822 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:01.822 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:01.822 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:01.822 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:01.822 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:01.822 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:01.822 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:02.081 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:02.341 [ 00:16:02.341 { 00:16:02.341 "name": "BaseBdev2", 00:16:02.341 "aliases": [ 00:16:02.341 "dec36ac6-bc51-4958-ac24-0ee2e0578e83" 00:16:02.341 ], 00:16:02.341 "product_name": "Malloc disk", 00:16:02.341 "block_size": 512, 00:16:02.341 "num_blocks": 65536, 00:16:02.341 "uuid": "dec36ac6-bc51-4958-ac24-0ee2e0578e83", 00:16:02.341 "assigned_rate_limits": { 00:16:02.341 "rw_ios_per_sec": 0, 00:16:02.341 "rw_mbytes_per_sec": 0, 00:16:02.341 "r_mbytes_per_sec": 0, 00:16:02.341 "w_mbytes_per_sec": 0 00:16:02.341 }, 00:16:02.341 "claimed": true, 00:16:02.341 "claim_type": "exclusive_write", 00:16:02.341 "zoned": false, 00:16:02.341 "supported_io_types": { 00:16:02.341 "read": true, 00:16:02.341 "write": true, 00:16:02.341 "unmap": true, 00:16:02.341 "flush": true, 00:16:02.341 "reset": true, 00:16:02.341 "nvme_admin": false, 00:16:02.341 "nvme_io": false, 00:16:02.341 "nvme_io_md": false, 00:16:02.341 "write_zeroes": true, 00:16:02.341 "zcopy": true, 00:16:02.341 "get_zone_info": false, 00:16:02.341 "zone_management": false, 00:16:02.341 "zone_append": false, 00:16:02.341 "compare": false, 00:16:02.341 "compare_and_write": false, 00:16:02.341 "abort": true, 00:16:02.341 "seek_hole": false, 00:16:02.341 "seek_data": false, 00:16:02.341 "copy": true, 00:16:02.341 "nvme_iov_md": false 00:16:02.341 }, 00:16:02.341 "memory_domains": [ 00:16:02.341 { 00:16:02.341 "dma_device_id": "system", 00:16:02.341 "dma_device_type": 1 00:16:02.341 }, 00:16:02.341 { 00:16:02.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.341 "dma_device_type": 2 00:16:02.341 } 00:16:02.341 ], 00:16:02.341 "driver_specific": {} 00:16:02.341 } 00:16:02.341 ] 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:02.341 07:21:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.341 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.600 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.600 "name": "Existed_Raid", 00:16:02.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.600 "strip_size_kb": 64, 00:16:02.600 "state": "configuring", 00:16:02.600 "raid_level": "concat", 00:16:02.600 "superblock": false, 00:16:02.600 "num_base_bdevs": 3, 00:16:02.600 "num_base_bdevs_discovered": 2, 00:16:02.600 "num_base_bdevs_operational": 3, 00:16:02.600 "base_bdevs_list": [ 00:16:02.600 { 00:16:02.600 "name": "BaseBdev1", 00:16:02.600 "uuid": "b7aadf51-1459-4066-97b8-34903703f0e6", 00:16:02.600 "is_configured": true, 00:16:02.600 "data_offset": 0, 00:16:02.600 "data_size": 65536 00:16:02.600 }, 00:16:02.601 { 00:16:02.601 "name": "BaseBdev2", 00:16:02.601 "uuid": "dec36ac6-bc51-4958-ac24-0ee2e0578e83", 00:16:02.601 "is_configured": true, 00:16:02.601 "data_offset": 0, 00:16:02.601 "data_size": 65536 00:16:02.601 }, 00:16:02.601 { 00:16:02.601 "name": "BaseBdev3", 00:16:02.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.601 "is_configured": false, 00:16:02.601 "data_offset": 0, 00:16:02.601 "data_size": 0 00:16:02.601 } 00:16:02.601 ] 00:16:02.601 }' 00:16:02.601 07:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.601 07:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.539 07:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:03.539 [2024-07-25 07:21:36.031516] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:03.539 [2024-07-25 07:21:36.031549] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15b2680 
00:16:03.539 [2024-07-25 07:21:36.031557] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:16:03.539 [2024-07-25 07:21:36.031730] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15b40b0 00:16:03.539 [2024-07-25 07:21:36.031842] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15b2680 00:16:03.539 [2024-07-25 07:21:36.031851] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15b2680 00:16:03.539 [2024-07-25 07:21:36.031994] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.539 BaseBdev3 00:16:03.539 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:03.539 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:03.539 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:03.539 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:03.539 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:03.539 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:03.539 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.798 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:04.057 [ 00:16:04.057 { 00:16:04.057 "name": "BaseBdev3", 00:16:04.057 "aliases": [ 00:16:04.057 "848f8fb3-6655-42be-8017-11c4a00034ea" 00:16:04.057 ], 00:16:04.057 "product_name": "Malloc disk", 00:16:04.057 "block_size": 512, 00:16:04.057 "num_blocks": 65536, 00:16:04.057 "uuid": "848f8fb3-6655-42be-8017-11c4a00034ea", 00:16:04.057 "assigned_rate_limits": { 00:16:04.057 "rw_ios_per_sec": 0, 00:16:04.057 "rw_mbytes_per_sec": 0, 00:16:04.057 "r_mbytes_per_sec": 0, 00:16:04.057 "w_mbytes_per_sec": 0 00:16:04.057 }, 00:16:04.057 "claimed": true, 00:16:04.057 "claim_type": "exclusive_write", 00:16:04.057 "zoned": false, 00:16:04.057 "supported_io_types": { 00:16:04.057 "read": true, 00:16:04.057 "write": true, 00:16:04.057 "unmap": true, 00:16:04.057 "flush": true, 00:16:04.057 "reset": true, 00:16:04.057 "nvme_admin": false, 00:16:04.057 "nvme_io": false, 00:16:04.057 "nvme_io_md": false, 00:16:04.057 "write_zeroes": true, 00:16:04.057 "zcopy": true, 00:16:04.057 "get_zone_info": false, 00:16:04.057 "zone_management": false, 00:16:04.057 "zone_append": false, 00:16:04.058 "compare": false, 00:16:04.058 "compare_and_write": false, 00:16:04.058 "abort": true, 00:16:04.058 "seek_hole": false, 00:16:04.058 "seek_data": false, 00:16:04.058 "copy": true, 00:16:04.058 "nvme_iov_md": false 00:16:04.058 }, 00:16:04.058 "memory_domains": [ 00:16:04.058 { 00:16:04.058 "dma_device_id": "system", 00:16:04.058 "dma_device_type": 1 00:16:04.058 }, 00:16:04.058 { 00:16:04.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.058 "dma_device_type": 2 00:16:04.058 } 00:16:04.058 ], 00:16:04.058 "driver_specific": {} 00:16:04.058 } 00:16:04.058 ] 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:04.058 07:21:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.058 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.317 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.317 "name": "Existed_Raid", 00:16:04.317 "uuid": "1dc9c7cc-e7cd-4e6a-8634-352432154bca", 00:16:04.317 "strip_size_kb": 64, 00:16:04.317 "state": "online", 00:16:04.317 "raid_level": "concat", 00:16:04.317 "superblock": false, 00:16:04.317 "num_base_bdevs": 3, 00:16:04.317 "num_base_bdevs_discovered": 3, 00:16:04.317 "num_base_bdevs_operational": 3, 00:16:04.317 "base_bdevs_list": [ 00:16:04.317 { 00:16:04.317 "name": "BaseBdev1", 00:16:04.317 "uuid": "b7aadf51-1459-4066-97b8-34903703f0e6", 00:16:04.317 "is_configured": true, 00:16:04.317 "data_offset": 0, 00:16:04.317 "data_size": 65536 00:16:04.317 }, 00:16:04.317 { 00:16:04.317 "name": "BaseBdev2", 00:16:04.317 "uuid": "dec36ac6-bc51-4958-ac24-0ee2e0578e83", 00:16:04.317 "is_configured": true, 00:16:04.317 "data_offset": 0, 00:16:04.317 "data_size": 65536 00:16:04.317 }, 00:16:04.317 { 00:16:04.317 "name": "BaseBdev3", 00:16:04.317 "uuid": "848f8fb3-6655-42be-8017-11c4a00034ea", 00:16:04.317 "is_configured": true, 00:16:04.317 "data_offset": 0, 00:16:04.317 "data_size": 65536 00:16:04.317 } 00:16:04.317 ] 00:16:04.317 }' 00:16:04.317 07:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.317 07:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:04.885 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:05.144 [2024-07-25 07:21:37.487623] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:05.144 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:05.144 "name": "Existed_Raid", 00:16:05.144 "aliases": [ 00:16:05.144 "1dc9c7cc-e7cd-4e6a-8634-352432154bca" 00:16:05.144 ], 00:16:05.144 "product_name": "Raid Volume", 00:16:05.144 "block_size": 512, 00:16:05.144 "num_blocks": 196608, 00:16:05.144 "uuid": "1dc9c7cc-e7cd-4e6a-8634-352432154bca", 00:16:05.144 "assigned_rate_limits": { 00:16:05.144 "rw_ios_per_sec": 0, 00:16:05.144 "rw_mbytes_per_sec": 0, 00:16:05.144 "r_mbytes_per_sec": 0, 00:16:05.144 "w_mbytes_per_sec": 0 00:16:05.144 }, 00:16:05.144 "claimed": false, 00:16:05.144 "zoned": false, 00:16:05.144 "supported_io_types": { 00:16:05.144 "read": true, 00:16:05.144 "write": true, 00:16:05.144 "unmap": true, 00:16:05.144 "flush": true, 00:16:05.144 "reset": true, 00:16:05.144 "nvme_admin": false, 00:16:05.145 "nvme_io": false, 00:16:05.145 "nvme_io_md": false, 00:16:05.145 "write_zeroes": true, 00:16:05.145 "zcopy": false, 00:16:05.145 "get_zone_info": false, 00:16:05.145 "zone_management": false, 00:16:05.145 "zone_append": false, 00:16:05.145 "compare": false, 00:16:05.145 "compare_and_write": false, 00:16:05.145 "abort": false, 00:16:05.145 "seek_hole": false, 00:16:05.145 "seek_data": false, 00:16:05.145 "copy": false, 00:16:05.145 "nvme_iov_md": false 00:16:05.145 }, 00:16:05.145 "memory_domains": [ 00:16:05.145 { 00:16:05.145 "dma_device_id": "system", 00:16:05.145 "dma_device_type": 1 00:16:05.145 }, 00:16:05.145 { 00:16:05.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.145 "dma_device_type": 2 00:16:05.145 }, 00:16:05.145 { 00:16:05.145 "dma_device_id": "system", 00:16:05.145 "dma_device_type": 1 00:16:05.145 }, 00:16:05.145 { 00:16:05.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.145 "dma_device_type": 2 00:16:05.145 }, 00:16:05.145 { 00:16:05.145 "dma_device_id": "system", 00:16:05.145 "dma_device_type": 1 00:16:05.145 }, 00:16:05.145 { 00:16:05.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.145 "dma_device_type": 2 00:16:05.145 } 00:16:05.145 ], 00:16:05.145 "driver_specific": { 00:16:05.145 "raid": { 00:16:05.145 "uuid": "1dc9c7cc-e7cd-4e6a-8634-352432154bca", 00:16:05.145 "strip_size_kb": 64, 00:16:05.145 "state": "online", 00:16:05.145 "raid_level": "concat", 00:16:05.145 "superblock": false, 00:16:05.145 "num_base_bdevs": 3, 00:16:05.145 "num_base_bdevs_discovered": 3, 00:16:05.145 "num_base_bdevs_operational": 3, 00:16:05.145 "base_bdevs_list": [ 00:16:05.145 { 00:16:05.145 "name": "BaseBdev1", 00:16:05.145 "uuid": "b7aadf51-1459-4066-97b8-34903703f0e6", 00:16:05.145 "is_configured": true, 00:16:05.145 "data_offset": 0, 00:16:05.145 "data_size": 65536 00:16:05.145 }, 00:16:05.145 { 00:16:05.145 "name": "BaseBdev2", 00:16:05.145 "uuid": "dec36ac6-bc51-4958-ac24-0ee2e0578e83", 00:16:05.145 "is_configured": true, 00:16:05.145 "data_offset": 0, 00:16:05.145 
"data_size": 65536 00:16:05.145 }, 00:16:05.145 { 00:16:05.145 "name": "BaseBdev3", 00:16:05.145 "uuid": "848f8fb3-6655-42be-8017-11c4a00034ea", 00:16:05.145 "is_configured": true, 00:16:05.145 "data_offset": 0, 00:16:05.145 "data_size": 65536 00:16:05.145 } 00:16:05.145 ] 00:16:05.145 } 00:16:05.145 } 00:16:05.145 }' 00:16:05.145 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:05.145 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:05.145 BaseBdev2 00:16:05.145 BaseBdev3' 00:16:05.145 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:05.145 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:05.145 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.404 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.404 "name": "BaseBdev1", 00:16:05.404 "aliases": [ 00:16:05.404 "b7aadf51-1459-4066-97b8-34903703f0e6" 00:16:05.404 ], 00:16:05.404 "product_name": "Malloc disk", 00:16:05.404 "block_size": 512, 00:16:05.404 "num_blocks": 65536, 00:16:05.404 "uuid": "b7aadf51-1459-4066-97b8-34903703f0e6", 00:16:05.404 "assigned_rate_limits": { 00:16:05.404 "rw_ios_per_sec": 0, 00:16:05.404 "rw_mbytes_per_sec": 0, 00:16:05.404 "r_mbytes_per_sec": 0, 00:16:05.404 "w_mbytes_per_sec": 0 00:16:05.404 }, 00:16:05.404 "claimed": true, 00:16:05.404 "claim_type": "exclusive_write", 00:16:05.404 "zoned": false, 00:16:05.404 "supported_io_types": { 00:16:05.404 "read": true, 00:16:05.404 "write": true, 00:16:05.404 "unmap": true, 00:16:05.404 "flush": true, 00:16:05.404 "reset": true, 00:16:05.404 "nvme_admin": false, 00:16:05.404 "nvme_io": false, 00:16:05.404 "nvme_io_md": false, 00:16:05.404 "write_zeroes": true, 00:16:05.404 "zcopy": true, 00:16:05.404 "get_zone_info": false, 00:16:05.404 "zone_management": false, 00:16:05.404 "zone_append": false, 00:16:05.404 "compare": false, 00:16:05.404 "compare_and_write": false, 00:16:05.404 "abort": true, 00:16:05.404 "seek_hole": false, 00:16:05.404 "seek_data": false, 00:16:05.404 "copy": true, 00:16:05.404 "nvme_iov_md": false 00:16:05.404 }, 00:16:05.404 "memory_domains": [ 00:16:05.404 { 00:16:05.404 "dma_device_id": "system", 00:16:05.404 "dma_device_type": 1 00:16:05.404 }, 00:16:05.404 { 00:16:05.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.404 "dma_device_type": 2 00:16:05.404 } 00:16:05.404 ], 00:16:05.404 "driver_specific": {} 00:16:05.404 }' 00:16:05.404 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.404 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.404 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:05.404 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.404 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.663 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:05.663 07:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.663 07:21:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.663 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:05.663 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.663 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.663 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:05.663 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:05.663 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:05.664 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.923 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.923 "name": "BaseBdev2", 00:16:05.923 "aliases": [ 00:16:05.923 "dec36ac6-bc51-4958-ac24-0ee2e0578e83" 00:16:05.923 ], 00:16:05.923 "product_name": "Malloc disk", 00:16:05.923 "block_size": 512, 00:16:05.923 "num_blocks": 65536, 00:16:05.923 "uuid": "dec36ac6-bc51-4958-ac24-0ee2e0578e83", 00:16:05.923 "assigned_rate_limits": { 00:16:05.923 "rw_ios_per_sec": 0, 00:16:05.923 "rw_mbytes_per_sec": 0, 00:16:05.923 "r_mbytes_per_sec": 0, 00:16:05.923 "w_mbytes_per_sec": 0 00:16:05.923 }, 00:16:05.923 "claimed": true, 00:16:05.923 "claim_type": "exclusive_write", 00:16:05.923 "zoned": false, 00:16:05.923 "supported_io_types": { 00:16:05.923 "read": true, 00:16:05.923 "write": true, 00:16:05.923 "unmap": true, 00:16:05.923 "flush": true, 00:16:05.923 "reset": true, 00:16:05.923 "nvme_admin": false, 00:16:05.923 "nvme_io": false, 00:16:05.923 "nvme_io_md": false, 00:16:05.923 "write_zeroes": true, 00:16:05.923 "zcopy": true, 00:16:05.923 "get_zone_info": false, 00:16:05.923 "zone_management": false, 00:16:05.923 "zone_append": false, 00:16:05.923 "compare": false, 00:16:05.923 "compare_and_write": false, 00:16:05.923 "abort": true, 00:16:05.923 "seek_hole": false, 00:16:05.923 "seek_data": false, 00:16:05.923 "copy": true, 00:16:05.923 "nvme_iov_md": false 00:16:05.923 }, 00:16:05.923 "memory_domains": [ 00:16:05.923 { 00:16:05.923 "dma_device_id": "system", 00:16:05.923 "dma_device_type": 1 00:16:05.923 }, 00:16:05.923 { 00:16:05.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.923 "dma_device_type": 2 00:16:05.923 } 00:16:05.923 ], 00:16:05.923 "driver_specific": {} 00:16:05.923 }' 00:16:05.923 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.923 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.923 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:05.923 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:06.182 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.441 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.441 "name": "BaseBdev3", 00:16:06.441 "aliases": [ 00:16:06.441 "848f8fb3-6655-42be-8017-11c4a00034ea" 00:16:06.441 ], 00:16:06.441 "product_name": "Malloc disk", 00:16:06.441 "block_size": 512, 00:16:06.441 "num_blocks": 65536, 00:16:06.441 "uuid": "848f8fb3-6655-42be-8017-11c4a00034ea", 00:16:06.441 "assigned_rate_limits": { 00:16:06.441 "rw_ios_per_sec": 0, 00:16:06.441 "rw_mbytes_per_sec": 0, 00:16:06.441 "r_mbytes_per_sec": 0, 00:16:06.441 "w_mbytes_per_sec": 0 00:16:06.441 }, 00:16:06.441 "claimed": true, 00:16:06.441 "claim_type": "exclusive_write", 00:16:06.441 "zoned": false, 00:16:06.441 "supported_io_types": { 00:16:06.441 "read": true, 00:16:06.441 "write": true, 00:16:06.441 "unmap": true, 00:16:06.441 "flush": true, 00:16:06.441 "reset": true, 00:16:06.441 "nvme_admin": false, 00:16:06.441 "nvme_io": false, 00:16:06.441 "nvme_io_md": false, 00:16:06.441 "write_zeroes": true, 00:16:06.441 "zcopy": true, 00:16:06.441 "get_zone_info": false, 00:16:06.441 "zone_management": false, 00:16:06.441 "zone_append": false, 00:16:06.441 "compare": false, 00:16:06.441 "compare_and_write": false, 00:16:06.441 "abort": true, 00:16:06.441 "seek_hole": false, 00:16:06.441 "seek_data": false, 00:16:06.441 "copy": true, 00:16:06.441 "nvme_iov_md": false 00:16:06.441 }, 00:16:06.441 "memory_domains": [ 00:16:06.441 { 00:16:06.441 "dma_device_id": "system", 00:16:06.441 "dma_device_type": 1 00:16:06.441 }, 00:16:06.441 { 00:16:06.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.441 "dma_device_type": 2 00:16:06.441 } 00:16:06.441 ], 00:16:06.441 "driver_specific": {} 00:16:06.441 }' 00:16:06.441 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.441 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.700 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.700 07:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.700 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.700 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.700 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.700 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.700 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.700 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.700 07:21:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:06.959 [2024-07-25 07:21:39.452586] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:06.959 [2024-07-25 07:21:39.452608] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:06.959 [2024-07-25 07:21:39.452644] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.959 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.960 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.960 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.960 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.960 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.217 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.218 "name": "Existed_Raid", 00:16:07.218 "uuid": "1dc9c7cc-e7cd-4e6a-8634-352432154bca", 00:16:07.218 "strip_size_kb": 64, 00:16:07.218 "state": "offline", 00:16:07.218 "raid_level": "concat", 00:16:07.218 "superblock": false, 00:16:07.218 "num_base_bdevs": 3, 00:16:07.218 "num_base_bdevs_discovered": 2, 00:16:07.218 "num_base_bdevs_operational": 2, 00:16:07.218 "base_bdevs_list": [ 00:16:07.218 { 00:16:07.218 "name": null, 00:16:07.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.218 "is_configured": false, 00:16:07.218 "data_offset": 0, 00:16:07.218 "data_size": 65536 00:16:07.218 }, 00:16:07.218 { 00:16:07.218 "name": "BaseBdev2", 00:16:07.218 "uuid": "dec36ac6-bc51-4958-ac24-0ee2e0578e83", 00:16:07.218 
"is_configured": true, 00:16:07.218 "data_offset": 0, 00:16:07.218 "data_size": 65536 00:16:07.218 }, 00:16:07.218 { 00:16:07.218 "name": "BaseBdev3", 00:16:07.218 "uuid": "848f8fb3-6655-42be-8017-11c4a00034ea", 00:16:07.218 "is_configured": true, 00:16:07.218 "data_offset": 0, 00:16:07.218 "data_size": 65536 00:16:07.218 } 00:16:07.218 ] 00:16:07.218 }' 00:16:07.218 07:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.218 07:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.784 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:07.784 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:07.785 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.785 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:08.043 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:08.043 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:08.043 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:08.302 [2024-07-25 07:21:40.700873] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:08.302 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:08.302 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:08.302 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.302 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:08.563 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:08.563 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:08.563 07:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:08.822 [2024-07-25 07:21:41.168209] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:08.822 [2024-07-25 07:21:41.168248] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b2680 name Existed_Raid, state offline 00:16:08.822 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:08.822 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:08.822 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.822 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:09.081 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:09.081 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 
-- # '[' -n '' ']' 00:16:09.081 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:09.081 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:09.081 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:09.081 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:09.340 BaseBdev2 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:09.340 07:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:09.599 [ 00:16:09.599 { 00:16:09.599 "name": "BaseBdev2", 00:16:09.599 "aliases": [ 00:16:09.599 "616c7e1b-4b08-4d9b-baf6-0888932debcf" 00:16:09.599 ], 00:16:09.599 "product_name": "Malloc disk", 00:16:09.599 "block_size": 512, 00:16:09.599 "num_blocks": 65536, 00:16:09.599 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:09.599 "assigned_rate_limits": { 00:16:09.599 "rw_ios_per_sec": 0, 00:16:09.599 "rw_mbytes_per_sec": 0, 00:16:09.599 "r_mbytes_per_sec": 0, 00:16:09.599 "w_mbytes_per_sec": 0 00:16:09.599 }, 00:16:09.599 "claimed": false, 00:16:09.599 "zoned": false, 00:16:09.599 "supported_io_types": { 00:16:09.599 "read": true, 00:16:09.599 "write": true, 00:16:09.599 "unmap": true, 00:16:09.599 "flush": true, 00:16:09.599 "reset": true, 00:16:09.599 "nvme_admin": false, 00:16:09.599 "nvme_io": false, 00:16:09.599 "nvme_io_md": false, 00:16:09.599 "write_zeroes": true, 00:16:09.599 "zcopy": true, 00:16:09.599 "get_zone_info": false, 00:16:09.599 "zone_management": false, 00:16:09.599 "zone_append": false, 00:16:09.599 "compare": false, 00:16:09.599 "compare_and_write": false, 00:16:09.599 "abort": true, 00:16:09.599 "seek_hole": false, 00:16:09.599 "seek_data": false, 00:16:09.599 "copy": true, 00:16:09.599 "nvme_iov_md": false 00:16:09.599 }, 00:16:09.599 "memory_domains": [ 00:16:09.599 { 00:16:09.599 "dma_device_id": "system", 00:16:09.599 "dma_device_type": 1 00:16:09.599 }, 00:16:09.599 { 00:16:09.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.599 "dma_device_type": 2 00:16:09.599 } 00:16:09.599 ], 00:16:09.599 "driver_specific": {} 00:16:09.599 } 00:16:09.599 ] 00:16:09.599 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:09.599 07:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:09.599 07:21:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:09.599 07:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:09.858 BaseBdev3 00:16:09.858 07:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:09.858 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:09.858 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:09.858 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:09.858 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:09.858 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:09.858 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.117 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:10.376 [ 00:16:10.376 { 00:16:10.376 "name": "BaseBdev3", 00:16:10.376 "aliases": [ 00:16:10.376 "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b" 00:16:10.376 ], 00:16:10.376 "product_name": "Malloc disk", 00:16:10.376 "block_size": 512, 00:16:10.376 "num_blocks": 65536, 00:16:10.376 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:10.376 "assigned_rate_limits": { 00:16:10.376 "rw_ios_per_sec": 0, 00:16:10.376 "rw_mbytes_per_sec": 0, 00:16:10.376 "r_mbytes_per_sec": 0, 00:16:10.376 "w_mbytes_per_sec": 0 00:16:10.376 }, 00:16:10.376 "claimed": false, 00:16:10.376 "zoned": false, 00:16:10.376 "supported_io_types": { 00:16:10.376 "read": true, 00:16:10.376 "write": true, 00:16:10.376 "unmap": true, 00:16:10.376 "flush": true, 00:16:10.376 "reset": true, 00:16:10.376 "nvme_admin": false, 00:16:10.376 "nvme_io": false, 00:16:10.376 "nvme_io_md": false, 00:16:10.376 "write_zeroes": true, 00:16:10.376 "zcopy": true, 00:16:10.376 "get_zone_info": false, 00:16:10.376 "zone_management": false, 00:16:10.376 "zone_append": false, 00:16:10.376 "compare": false, 00:16:10.376 "compare_and_write": false, 00:16:10.376 "abort": true, 00:16:10.376 "seek_hole": false, 00:16:10.376 "seek_data": false, 00:16:10.376 "copy": true, 00:16:10.376 "nvme_iov_md": false 00:16:10.376 }, 00:16:10.376 "memory_domains": [ 00:16:10.376 { 00:16:10.376 "dma_device_id": "system", 00:16:10.376 "dma_device_type": 1 00:16:10.376 }, 00:16:10.376 { 00:16:10.376 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.376 "dma_device_type": 2 00:16:10.376 } 00:16:10.376 ], 00:16:10.376 "driver_specific": {} 00:16:10.376 } 00:16:10.376 ] 00:16:10.376 07:21:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:10.376 07:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:10.376 07:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:10.376 07:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 
BaseBdev3' -n Existed_Raid 00:16:10.635 [2024-07-25 07:21:42.983301] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:10.635 [2024-07-25 07:21:42.983337] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:10.635 [2024-07-25 07:21:42.983355] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:10.635 [2024-07-25 07:21:42.984606] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.635 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.894 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.894 "name": "Existed_Raid", 00:16:10.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:10.894 "strip_size_kb": 64, 00:16:10.894 "state": "configuring", 00:16:10.894 "raid_level": "concat", 00:16:10.894 "superblock": false, 00:16:10.894 "num_base_bdevs": 3, 00:16:10.894 "num_base_bdevs_discovered": 2, 00:16:10.894 "num_base_bdevs_operational": 3, 00:16:10.894 "base_bdevs_list": [ 00:16:10.894 { 00:16:10.894 "name": "BaseBdev1", 00:16:10.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:10.894 "is_configured": false, 00:16:10.894 "data_offset": 0, 00:16:10.894 "data_size": 0 00:16:10.894 }, 00:16:10.894 { 00:16:10.894 "name": "BaseBdev2", 00:16:10.894 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:10.894 "is_configured": true, 00:16:10.894 "data_offset": 0, 00:16:10.894 "data_size": 65536 00:16:10.894 }, 00:16:10.894 { 00:16:10.894 "name": "BaseBdev3", 00:16:10.894 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:10.894 "is_configured": true, 00:16:10.894 "data_offset": 0, 00:16:10.894 "data_size": 65536 00:16:10.894 } 00:16:10.894 ] 00:16:10.894 }' 00:16:10.894 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.894 07:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.462 07:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:11.462 [2024-07-25 07:21:43.994116] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.721 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.721 "name": "Existed_Raid", 00:16:11.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.721 "strip_size_kb": 64, 00:16:11.721 "state": "configuring", 00:16:11.721 "raid_level": "concat", 00:16:11.721 "superblock": false, 00:16:11.721 "num_base_bdevs": 3, 00:16:11.721 "num_base_bdevs_discovered": 1, 00:16:11.721 "num_base_bdevs_operational": 3, 00:16:11.721 "base_bdevs_list": [ 00:16:11.721 { 00:16:11.721 "name": "BaseBdev1", 00:16:11.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.721 "is_configured": false, 00:16:11.721 "data_offset": 0, 00:16:11.721 "data_size": 0 00:16:11.721 }, 00:16:11.721 { 00:16:11.721 "name": null, 00:16:11.721 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:11.721 "is_configured": false, 00:16:11.722 "data_offset": 0, 00:16:11.722 "data_size": 65536 00:16:11.722 }, 00:16:11.722 { 00:16:11.722 "name": "BaseBdev3", 00:16:11.722 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:11.722 "is_configured": true, 00:16:11.722 "data_offset": 0, 00:16:11.722 "data_size": 65536 00:16:11.722 } 00:16:11.722 ] 00:16:11.722 }' 00:16:11.722 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.722 07:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:12.289 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.289 07:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:12.547 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ 
false == \f\a\l\s\e ]] 00:16:12.547 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:12.806 [2024-07-25 07:21:45.248848] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:12.806 BaseBdev1 00:16:12.806 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:12.806 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:12.806 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:12.806 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:12.806 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:12.806 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:12.806 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:13.064 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:13.343 [ 00:16:13.343 { 00:16:13.343 "name": "BaseBdev1", 00:16:13.343 "aliases": [ 00:16:13.343 "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21" 00:16:13.343 ], 00:16:13.343 "product_name": "Malloc disk", 00:16:13.343 "block_size": 512, 00:16:13.343 "num_blocks": 65536, 00:16:13.343 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:13.343 "assigned_rate_limits": { 00:16:13.343 "rw_ios_per_sec": 0, 00:16:13.343 "rw_mbytes_per_sec": 0, 00:16:13.343 "r_mbytes_per_sec": 0, 00:16:13.343 "w_mbytes_per_sec": 0 00:16:13.343 }, 00:16:13.343 "claimed": true, 00:16:13.343 "claim_type": "exclusive_write", 00:16:13.343 "zoned": false, 00:16:13.343 "supported_io_types": { 00:16:13.343 "read": true, 00:16:13.343 "write": true, 00:16:13.343 "unmap": true, 00:16:13.343 "flush": true, 00:16:13.343 "reset": true, 00:16:13.343 "nvme_admin": false, 00:16:13.343 "nvme_io": false, 00:16:13.343 "nvme_io_md": false, 00:16:13.343 "write_zeroes": true, 00:16:13.343 "zcopy": true, 00:16:13.343 "get_zone_info": false, 00:16:13.343 "zone_management": false, 00:16:13.343 "zone_append": false, 00:16:13.343 "compare": false, 00:16:13.343 "compare_and_write": false, 00:16:13.343 "abort": true, 00:16:13.343 "seek_hole": false, 00:16:13.343 "seek_data": false, 00:16:13.343 "copy": true, 00:16:13.343 "nvme_iov_md": false 00:16:13.343 }, 00:16:13.343 "memory_domains": [ 00:16:13.343 { 00:16:13.343 "dma_device_id": "system", 00:16:13.343 "dma_device_type": 1 00:16:13.343 }, 00:16:13.343 { 00:16:13.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.343 "dma_device_type": 2 00:16:13.343 } 00:16:13.343 ], 00:16:13.343 "driver_specific": {} 00:16:13.343 } 00:16:13.343 ] 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.343 07:21:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.343 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.344 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.344 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.619 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.619 "name": "Existed_Raid", 00:16:13.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.619 "strip_size_kb": 64, 00:16:13.619 "state": "configuring", 00:16:13.619 "raid_level": "concat", 00:16:13.619 "superblock": false, 00:16:13.619 "num_base_bdevs": 3, 00:16:13.619 "num_base_bdevs_discovered": 2, 00:16:13.619 "num_base_bdevs_operational": 3, 00:16:13.619 "base_bdevs_list": [ 00:16:13.619 { 00:16:13.619 "name": "BaseBdev1", 00:16:13.619 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:13.619 "is_configured": true, 00:16:13.619 "data_offset": 0, 00:16:13.619 "data_size": 65536 00:16:13.619 }, 00:16:13.619 { 00:16:13.619 "name": null, 00:16:13.619 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:13.619 "is_configured": false, 00:16:13.619 "data_offset": 0, 00:16:13.619 "data_size": 65536 00:16:13.619 }, 00:16:13.619 { 00:16:13.619 "name": "BaseBdev3", 00:16:13.619 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:13.619 "is_configured": true, 00:16:13.619 "data_offset": 0, 00:16:13.619 "data_size": 65536 00:16:13.619 } 00:16:13.619 ] 00:16:13.619 }' 00:16:13.619 07:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.619 07:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.187 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.187 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:14.447 [2024-07-25 07:21:46.937308] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:14.447 07:21:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.447 07:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.707 07:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.707 "name": "Existed_Raid", 00:16:14.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.707 "strip_size_kb": 64, 00:16:14.707 "state": "configuring", 00:16:14.707 "raid_level": "concat", 00:16:14.707 "superblock": false, 00:16:14.707 "num_base_bdevs": 3, 00:16:14.707 "num_base_bdevs_discovered": 1, 00:16:14.707 "num_base_bdevs_operational": 3, 00:16:14.707 "base_bdevs_list": [ 00:16:14.707 { 00:16:14.707 "name": "BaseBdev1", 00:16:14.707 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:14.707 "is_configured": true, 00:16:14.707 "data_offset": 0, 00:16:14.707 "data_size": 65536 00:16:14.707 }, 00:16:14.707 { 00:16:14.707 "name": null, 00:16:14.707 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:14.707 "is_configured": false, 00:16:14.707 "data_offset": 0, 00:16:14.707 "data_size": 65536 00:16:14.707 }, 00:16:14.707 { 00:16:14.707 "name": null, 00:16:14.707 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:14.707 "is_configured": false, 00:16:14.707 "data_offset": 0, 00:16:14.707 "data_size": 65536 00:16:14.707 } 00:16:14.707 ] 00:16:14.707 }' 00:16:14.707 07:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.707 07:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.274 07:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.274 07:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:15.533 07:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:15.533 07:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:15.792 [2024-07-25 07:21:48.180605] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:15.792 07:21:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.792 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.051 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.051 "name": "Existed_Raid", 00:16:16.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.051 "strip_size_kb": 64, 00:16:16.051 "state": "configuring", 00:16:16.051 "raid_level": "concat", 00:16:16.051 "superblock": false, 00:16:16.051 "num_base_bdevs": 3, 00:16:16.051 "num_base_bdevs_discovered": 2, 00:16:16.051 "num_base_bdevs_operational": 3, 00:16:16.051 "base_bdevs_list": [ 00:16:16.051 { 00:16:16.051 "name": "BaseBdev1", 00:16:16.051 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:16.051 "is_configured": true, 00:16:16.051 "data_offset": 0, 00:16:16.051 "data_size": 65536 00:16:16.051 }, 00:16:16.051 { 00:16:16.051 "name": null, 00:16:16.051 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:16.051 "is_configured": false, 00:16:16.051 "data_offset": 0, 00:16:16.051 "data_size": 65536 00:16:16.051 }, 00:16:16.051 { 00:16:16.051 "name": "BaseBdev3", 00:16:16.051 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:16.051 "is_configured": true, 00:16:16.051 "data_offset": 0, 00:16:16.051 "data_size": 65536 00:16:16.051 } 00:16:16.051 ] 00:16:16.051 }' 00:16:16.051 07:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.051 07:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.619 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.619 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:16.878 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:16.878 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:17.136 [2024-07-25 
07:21:49.447957] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:17.136 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:17.136 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.136 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.136 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:17.136 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.136 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.136 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.137 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.137 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.137 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.137 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.137 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.395 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.395 "name": "Existed_Raid", 00:16:17.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.395 "strip_size_kb": 64, 00:16:17.395 "state": "configuring", 00:16:17.395 "raid_level": "concat", 00:16:17.395 "superblock": false, 00:16:17.395 "num_base_bdevs": 3, 00:16:17.395 "num_base_bdevs_discovered": 1, 00:16:17.395 "num_base_bdevs_operational": 3, 00:16:17.395 "base_bdevs_list": [ 00:16:17.395 { 00:16:17.395 "name": null, 00:16:17.395 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:17.395 "is_configured": false, 00:16:17.395 "data_offset": 0, 00:16:17.395 "data_size": 65536 00:16:17.395 }, 00:16:17.395 { 00:16:17.395 "name": null, 00:16:17.395 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:17.395 "is_configured": false, 00:16:17.396 "data_offset": 0, 00:16:17.396 "data_size": 65536 00:16:17.396 }, 00:16:17.396 { 00:16:17.396 "name": "BaseBdev3", 00:16:17.396 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:17.396 "is_configured": true, 00:16:17.396 "data_offset": 0, 00:16:17.396 "data_size": 65536 00:16:17.396 } 00:16:17.396 ] 00:16:17.396 }' 00:16:17.396 07:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.396 07:21:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.963 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.963 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:18.221 [2024-07-25 07:21:50.717288] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.221 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.481 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.481 "name": "Existed_Raid", 00:16:18.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.481 "strip_size_kb": 64, 00:16:18.481 "state": "configuring", 00:16:18.481 "raid_level": "concat", 00:16:18.481 "superblock": false, 00:16:18.481 "num_base_bdevs": 3, 00:16:18.481 "num_base_bdevs_discovered": 2, 00:16:18.481 "num_base_bdevs_operational": 3, 00:16:18.481 "base_bdevs_list": [ 00:16:18.481 { 00:16:18.481 "name": null, 00:16:18.481 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:18.481 "is_configured": false, 00:16:18.481 "data_offset": 0, 00:16:18.481 "data_size": 65536 00:16:18.481 }, 00:16:18.481 { 00:16:18.481 "name": "BaseBdev2", 00:16:18.481 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:18.481 "is_configured": true, 00:16:18.481 "data_offset": 0, 00:16:18.481 "data_size": 65536 00:16:18.481 }, 00:16:18.481 { 00:16:18.481 "name": "BaseBdev3", 00:16:18.481 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:18.481 "is_configured": true, 00:16:18.481 "data_offset": 0, 00:16:18.481 "data_size": 65536 00:16:18.481 } 00:16:18.481 ] 00:16:18.481 }' 00:16:18.481 07:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.481 07:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.048 07:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.048 07:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:19.307 07:21:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:19.307 07:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.307 07:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:19.565 07:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21 00:16:19.824 [2024-07-25 07:21:52.188346] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:19.824 [2024-07-25 07:21:52.188379] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15b0680 00:16:19.824 [2024-07-25 07:21:52.188387] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:16:19.824 [2024-07-25 07:21:52.188565] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15b1e90 00:16:19.824 [2024-07-25 07:21:52.188667] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15b0680 00:16:19.824 [2024-07-25 07:21:52.188676] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15b0680 00:16:19.824 [2024-07-25 07:21:52.188820] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:19.824 NewBaseBdev 00:16:19.824 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:19.824 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:19.824 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:19.824 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:19.824 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:19.824 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:19.824 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:20.083 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:20.342 [ 00:16:20.342 { 00:16:20.342 "name": "NewBaseBdev", 00:16:20.342 "aliases": [ 00:16:20.342 "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21" 00:16:20.342 ], 00:16:20.342 "product_name": "Malloc disk", 00:16:20.342 "block_size": 512, 00:16:20.342 "num_blocks": 65536, 00:16:20.342 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:20.342 "assigned_rate_limits": { 00:16:20.342 "rw_ios_per_sec": 0, 00:16:20.342 "rw_mbytes_per_sec": 0, 00:16:20.342 "r_mbytes_per_sec": 0, 00:16:20.342 "w_mbytes_per_sec": 0 00:16:20.342 }, 00:16:20.342 "claimed": true, 00:16:20.342 "claim_type": "exclusive_write", 00:16:20.342 "zoned": false, 00:16:20.342 "supported_io_types": { 00:16:20.342 "read": true, 00:16:20.342 "write": true, 00:16:20.342 "unmap": true, 00:16:20.342 "flush": true, 00:16:20.342 "reset": true, 00:16:20.342 "nvme_admin": false, 00:16:20.342 "nvme_io": false, 00:16:20.342 "nvme_io_md": false, 
00:16:20.342 "write_zeroes": true, 00:16:20.342 "zcopy": true, 00:16:20.342 "get_zone_info": false, 00:16:20.342 "zone_management": false, 00:16:20.342 "zone_append": false, 00:16:20.342 "compare": false, 00:16:20.342 "compare_and_write": false, 00:16:20.342 "abort": true, 00:16:20.342 "seek_hole": false, 00:16:20.342 "seek_data": false, 00:16:20.342 "copy": true, 00:16:20.342 "nvme_iov_md": false 00:16:20.342 }, 00:16:20.342 "memory_domains": [ 00:16:20.342 { 00:16:20.342 "dma_device_id": "system", 00:16:20.342 "dma_device_type": 1 00:16:20.342 }, 00:16:20.342 { 00:16:20.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.342 "dma_device_type": 2 00:16:20.342 } 00:16:20.342 ], 00:16:20.342 "driver_specific": {} 00:16:20.342 } 00:16:20.342 ] 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.342 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.601 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.601 "name": "Existed_Raid", 00:16:20.601 "uuid": "719daef3-c6b4-408b-a5c6-000e961e460a", 00:16:20.601 "strip_size_kb": 64, 00:16:20.601 "state": "online", 00:16:20.601 "raid_level": "concat", 00:16:20.601 "superblock": false, 00:16:20.601 "num_base_bdevs": 3, 00:16:20.601 "num_base_bdevs_discovered": 3, 00:16:20.601 "num_base_bdevs_operational": 3, 00:16:20.601 "base_bdevs_list": [ 00:16:20.601 { 00:16:20.601 "name": "NewBaseBdev", 00:16:20.601 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:20.601 "is_configured": true, 00:16:20.601 "data_offset": 0, 00:16:20.601 "data_size": 65536 00:16:20.601 }, 00:16:20.601 { 00:16:20.601 "name": "BaseBdev2", 00:16:20.601 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:20.601 "is_configured": true, 00:16:20.601 "data_offset": 0, 00:16:20.601 "data_size": 65536 00:16:20.601 }, 00:16:20.601 { 00:16:20.601 "name": "BaseBdev3", 00:16:20.601 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:20.601 "is_configured": true, 00:16:20.601 "data_offset": 0, 00:16:20.601 "data_size": 65536 00:16:20.601 } 00:16:20.601 ] 00:16:20.601 }' 
00:16:20.601 07:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.601 07:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.168 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:21.169 [2024-07-25 07:21:53.672526] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:21.169 "name": "Existed_Raid", 00:16:21.169 "aliases": [ 00:16:21.169 "719daef3-c6b4-408b-a5c6-000e961e460a" 00:16:21.169 ], 00:16:21.169 "product_name": "Raid Volume", 00:16:21.169 "block_size": 512, 00:16:21.169 "num_blocks": 196608, 00:16:21.169 "uuid": "719daef3-c6b4-408b-a5c6-000e961e460a", 00:16:21.169 "assigned_rate_limits": { 00:16:21.169 "rw_ios_per_sec": 0, 00:16:21.169 "rw_mbytes_per_sec": 0, 00:16:21.169 "r_mbytes_per_sec": 0, 00:16:21.169 "w_mbytes_per_sec": 0 00:16:21.169 }, 00:16:21.169 "claimed": false, 00:16:21.169 "zoned": false, 00:16:21.169 "supported_io_types": { 00:16:21.169 "read": true, 00:16:21.169 "write": true, 00:16:21.169 "unmap": true, 00:16:21.169 "flush": true, 00:16:21.169 "reset": true, 00:16:21.169 "nvme_admin": false, 00:16:21.169 "nvme_io": false, 00:16:21.169 "nvme_io_md": false, 00:16:21.169 "write_zeroes": true, 00:16:21.169 "zcopy": false, 00:16:21.169 "get_zone_info": false, 00:16:21.169 "zone_management": false, 00:16:21.169 "zone_append": false, 00:16:21.169 "compare": false, 00:16:21.169 "compare_and_write": false, 00:16:21.169 "abort": false, 00:16:21.169 "seek_hole": false, 00:16:21.169 "seek_data": false, 00:16:21.169 "copy": false, 00:16:21.169 "nvme_iov_md": false 00:16:21.169 }, 00:16:21.169 "memory_domains": [ 00:16:21.169 { 00:16:21.169 "dma_device_id": "system", 00:16:21.169 "dma_device_type": 1 00:16:21.169 }, 00:16:21.169 { 00:16:21.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.169 "dma_device_type": 2 00:16:21.169 }, 00:16:21.169 { 00:16:21.169 "dma_device_id": "system", 00:16:21.169 "dma_device_type": 1 00:16:21.169 }, 00:16:21.169 { 00:16:21.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.169 "dma_device_type": 2 00:16:21.169 }, 00:16:21.169 { 00:16:21.169 "dma_device_id": "system", 00:16:21.169 "dma_device_type": 1 00:16:21.169 }, 00:16:21.169 { 00:16:21.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.169 "dma_device_type": 2 00:16:21.169 } 00:16:21.169 ], 00:16:21.169 "driver_specific": { 00:16:21.169 "raid": { 00:16:21.169 "uuid": "719daef3-c6b4-408b-a5c6-000e961e460a", 00:16:21.169 "strip_size_kb": 64, 00:16:21.169 
"state": "online", 00:16:21.169 "raid_level": "concat", 00:16:21.169 "superblock": false, 00:16:21.169 "num_base_bdevs": 3, 00:16:21.169 "num_base_bdevs_discovered": 3, 00:16:21.169 "num_base_bdevs_operational": 3, 00:16:21.169 "base_bdevs_list": [ 00:16:21.169 { 00:16:21.169 "name": "NewBaseBdev", 00:16:21.169 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:21.169 "is_configured": true, 00:16:21.169 "data_offset": 0, 00:16:21.169 "data_size": 65536 00:16:21.169 }, 00:16:21.169 { 00:16:21.169 "name": "BaseBdev2", 00:16:21.169 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:21.169 "is_configured": true, 00:16:21.169 "data_offset": 0, 00:16:21.169 "data_size": 65536 00:16:21.169 }, 00:16:21.169 { 00:16:21.169 "name": "BaseBdev3", 00:16:21.169 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:21.169 "is_configured": true, 00:16:21.169 "data_offset": 0, 00:16:21.169 "data_size": 65536 00:16:21.169 } 00:16:21.169 ] 00:16:21.169 } 00:16:21.169 } 00:16:21.169 }' 00:16:21.169 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:21.427 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:21.427 BaseBdev2 00:16:21.427 BaseBdev3' 00:16:21.428 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:21.428 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:21.428 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:21.687 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:21.687 "name": "NewBaseBdev", 00:16:21.687 "aliases": [ 00:16:21.687 "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21" 00:16:21.687 ], 00:16:21.687 "product_name": "Malloc disk", 00:16:21.687 "block_size": 512, 00:16:21.687 "num_blocks": 65536, 00:16:21.687 "uuid": "a181a3d7-2aa6-41a4-a3c6-d3cc671cfc21", 00:16:21.687 "assigned_rate_limits": { 00:16:21.687 "rw_ios_per_sec": 0, 00:16:21.687 "rw_mbytes_per_sec": 0, 00:16:21.687 "r_mbytes_per_sec": 0, 00:16:21.687 "w_mbytes_per_sec": 0 00:16:21.687 }, 00:16:21.687 "claimed": true, 00:16:21.687 "claim_type": "exclusive_write", 00:16:21.687 "zoned": false, 00:16:21.687 "supported_io_types": { 00:16:21.687 "read": true, 00:16:21.687 "write": true, 00:16:21.687 "unmap": true, 00:16:21.687 "flush": true, 00:16:21.687 "reset": true, 00:16:21.687 "nvme_admin": false, 00:16:21.687 "nvme_io": false, 00:16:21.687 "nvme_io_md": false, 00:16:21.687 "write_zeroes": true, 00:16:21.687 "zcopy": true, 00:16:21.687 "get_zone_info": false, 00:16:21.687 "zone_management": false, 00:16:21.687 "zone_append": false, 00:16:21.687 "compare": false, 00:16:21.687 "compare_and_write": false, 00:16:21.687 "abort": true, 00:16:21.687 "seek_hole": false, 00:16:21.687 "seek_data": false, 00:16:21.687 "copy": true, 00:16:21.687 "nvme_iov_md": false 00:16:21.687 }, 00:16:21.687 "memory_domains": [ 00:16:21.687 { 00:16:21.687 "dma_device_id": "system", 00:16:21.687 "dma_device_type": 1 00:16:21.687 }, 00:16:21.687 { 00:16:21.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.687 "dma_device_type": 2 00:16:21.687 } 00:16:21.687 ], 00:16:21.687 "driver_specific": {} 00:16:21.688 }' 00:16:21.688 07:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:16:21.688 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.688 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:21.688 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.688 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.688 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:21.688 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.688 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.947 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:21.947 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.947 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.947 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:21.947 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:21.947 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:21.947 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.207 "name": "BaseBdev2", 00:16:22.207 "aliases": [ 00:16:22.207 "616c7e1b-4b08-4d9b-baf6-0888932debcf" 00:16:22.207 ], 00:16:22.207 "product_name": "Malloc disk", 00:16:22.207 "block_size": 512, 00:16:22.207 "num_blocks": 65536, 00:16:22.207 "uuid": "616c7e1b-4b08-4d9b-baf6-0888932debcf", 00:16:22.207 "assigned_rate_limits": { 00:16:22.207 "rw_ios_per_sec": 0, 00:16:22.207 "rw_mbytes_per_sec": 0, 00:16:22.207 "r_mbytes_per_sec": 0, 00:16:22.207 "w_mbytes_per_sec": 0 00:16:22.207 }, 00:16:22.207 "claimed": true, 00:16:22.207 "claim_type": "exclusive_write", 00:16:22.207 "zoned": false, 00:16:22.207 "supported_io_types": { 00:16:22.207 "read": true, 00:16:22.207 "write": true, 00:16:22.207 "unmap": true, 00:16:22.207 "flush": true, 00:16:22.207 "reset": true, 00:16:22.207 "nvme_admin": false, 00:16:22.207 "nvme_io": false, 00:16:22.207 "nvme_io_md": false, 00:16:22.207 "write_zeroes": true, 00:16:22.207 "zcopy": true, 00:16:22.207 "get_zone_info": false, 00:16:22.207 "zone_management": false, 00:16:22.207 "zone_append": false, 00:16:22.207 "compare": false, 00:16:22.207 "compare_and_write": false, 00:16:22.207 "abort": true, 00:16:22.207 "seek_hole": false, 00:16:22.207 "seek_data": false, 00:16:22.207 "copy": true, 00:16:22.207 "nvme_iov_md": false 00:16:22.207 }, 00:16:22.207 "memory_domains": [ 00:16:22.207 { 00:16:22.207 "dma_device_id": "system", 00:16:22.207 "dma_device_type": 1 00:16:22.207 }, 00:16:22.207 { 00:16:22.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.207 "dma_device_type": 2 00:16:22.207 } 00:16:22.207 ], 00:16:22.207 "driver_specific": {} 00:16:22.207 }' 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.207 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:22.466 07:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.725 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.725 "name": "BaseBdev3", 00:16:22.725 "aliases": [ 00:16:22.725 "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b" 00:16:22.725 ], 00:16:22.725 "product_name": "Malloc disk", 00:16:22.725 "block_size": 512, 00:16:22.725 "num_blocks": 65536, 00:16:22.725 "uuid": "943dc4e2-b906-443d-9c8e-3a5e75ea6b4b", 00:16:22.725 "assigned_rate_limits": { 00:16:22.725 "rw_ios_per_sec": 0, 00:16:22.725 "rw_mbytes_per_sec": 0, 00:16:22.725 "r_mbytes_per_sec": 0, 00:16:22.725 "w_mbytes_per_sec": 0 00:16:22.725 }, 00:16:22.725 "claimed": true, 00:16:22.725 "claim_type": "exclusive_write", 00:16:22.725 "zoned": false, 00:16:22.725 "supported_io_types": { 00:16:22.725 "read": true, 00:16:22.725 "write": true, 00:16:22.725 "unmap": true, 00:16:22.725 "flush": true, 00:16:22.725 "reset": true, 00:16:22.725 "nvme_admin": false, 00:16:22.725 "nvme_io": false, 00:16:22.725 "nvme_io_md": false, 00:16:22.725 "write_zeroes": true, 00:16:22.725 "zcopy": true, 00:16:22.725 "get_zone_info": false, 00:16:22.725 "zone_management": false, 00:16:22.725 "zone_append": false, 00:16:22.725 "compare": false, 00:16:22.725 "compare_and_write": false, 00:16:22.725 "abort": true, 00:16:22.725 "seek_hole": false, 00:16:22.725 "seek_data": false, 00:16:22.725 "copy": true, 00:16:22.725 "nvme_iov_md": false 00:16:22.725 }, 00:16:22.725 "memory_domains": [ 00:16:22.725 { 00:16:22.725 "dma_device_id": "system", 00:16:22.725 "dma_device_type": 1 00:16:22.725 }, 00:16:22.725 { 00:16:22.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.725 "dma_device_type": 2 00:16:22.725 } 00:16:22.725 ], 00:16:22.725 "driver_specific": {} 00:16:22.725 }' 00:16:22.725 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.725 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.725 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.725 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.725 07:21:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.984 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:23.244 [2024-07-25 07:21:55.677697] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:23.244 [2024-07-25 07:21:55.677719] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:23.244 [2024-07-25 07:21:55.677767] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:23.244 [2024-07-25 07:21:55.677815] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:23.244 [2024-07-25 07:21:55.677826] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b0680 name Existed_Raid, state offline 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1627884 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1627884 ']' 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1627884 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1627884 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1627884' 00:16:23.244 killing process with pid 1627884 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1627884 00:16:23.244 [2024-07-25 07:21:55.754415] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:23.244 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1627884 00:16:23.244 [2024-07-25 07:21:55.777565] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:23.503 07:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:23.503 00:16:23.503 real 0m26.736s 00:16:23.503 user 0m49.153s 00:16:23.503 sys 0m4.795s 00:16:23.503 07:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:23.503 07:21:55 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.503 ************************************ 00:16:23.503 END TEST raid_state_function_test 00:16:23.503 ************************************ 00:16:23.503 07:21:56 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:16:23.503 07:21:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:23.503 07:21:56 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:23.503 07:21:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:23.763 ************************************ 00:16:23.763 START TEST raid_state_function_test_sb 00:16:23.763 ************************************ 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 true 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # 
strip_size_create_arg='-z 64' 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1632979 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1632979' 00:16:23.763 Process raid pid: 1632979 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1632979 /var/tmp/spdk-raid.sock 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1632979 ']' 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:23.763 07:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:23.764 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:23.764 07:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:23.764 07:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:23.764 [2024-07-25 07:21:56.114016] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:16:23.764 [2024-07-25 07:21:56.114073] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.0 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.1 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.2 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.3 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.4 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.5 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.6 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:01.7 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.0 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.1 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.2 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.3 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.4 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.5 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.6 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3d:02.7 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.0 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.1 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.2 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.3 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.4 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.5 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.6 cannot be used 00:16:23.764 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:01.7 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.0 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.1 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.2 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.3 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.4 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.5 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.6 cannot be used 00:16:23.764 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:23.764 EAL: Requested device 0000:3f:02.7 cannot be used 00:16:23.764 [2024-07-25 07:21:56.246048] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:24.023 [2024-07-25 07:21:56.332567] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:24.023 [2024-07-25 07:21:56.386457] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:24.023 [2024-07-25 07:21:56.386484] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:24.591 07:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:24.591 07:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:16:24.591 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:24.850 [2024-07-25 07:21:57.223966] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:24.850 [2024-07-25 07:21:57.224002] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:24.850 [2024-07-25 07:21:57.224013] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:24.850 [2024-07-25 07:21:57.224023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:24.850 [2024-07-25 07:21:57.224031] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:24.850 [2024-07-25 07:21:57.224041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:24.850 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:24.850 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:24.850 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:24.850 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:24.850 07:21:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.850 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:24.850 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.851 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.851 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.851 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.851 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.851 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:25.109 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.109 "name": "Existed_Raid", 00:16:25.109 "uuid": "d6fe91ad-5574-4e0a-b8f0-014b5067b64e", 00:16:25.109 "strip_size_kb": 64, 00:16:25.109 "state": "configuring", 00:16:25.109 "raid_level": "concat", 00:16:25.109 "superblock": true, 00:16:25.109 "num_base_bdevs": 3, 00:16:25.109 "num_base_bdevs_discovered": 0, 00:16:25.109 "num_base_bdevs_operational": 3, 00:16:25.109 "base_bdevs_list": [ 00:16:25.109 { 00:16:25.109 "name": "BaseBdev1", 00:16:25.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.109 "is_configured": false, 00:16:25.109 "data_offset": 0, 00:16:25.109 "data_size": 0 00:16:25.109 }, 00:16:25.109 { 00:16:25.109 "name": "BaseBdev2", 00:16:25.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.109 "is_configured": false, 00:16:25.109 "data_offset": 0, 00:16:25.109 "data_size": 0 00:16:25.109 }, 00:16:25.109 { 00:16:25.109 "name": "BaseBdev3", 00:16:25.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.109 "is_configured": false, 00:16:25.109 "data_offset": 0, 00:16:25.110 "data_size": 0 00:16:25.110 } 00:16:25.110 ] 00:16:25.110 }' 00:16:25.110 07:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.110 07:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:25.677 07:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:25.935 [2024-07-25 07:21:58.250535] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:25.935 [2024-07-25 07:21:58.250561] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf97ec0 name Existed_Raid, state configuring 00:16:25.935 07:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:26.194 [2024-07-25 07:21:58.479157] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:26.194 [2024-07-25 07:21:58.479180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:26.194 [2024-07-25 07:21:58.479190] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:26.194 [2024-07-25 07:21:58.479200] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:26.194 [2024-07-25 07:21:58.479208] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:26.194 [2024-07-25 07:21:58.479218] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:26.195 [2024-07-25 07:21:58.709117] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:26.195 BaseBdev1 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:26.195 07:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:26.462 07:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:26.721 [ 00:16:26.721 { 00:16:26.721 "name": "BaseBdev1", 00:16:26.721 "aliases": [ 00:16:26.721 "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090" 00:16:26.721 ], 00:16:26.721 "product_name": "Malloc disk", 00:16:26.721 "block_size": 512, 00:16:26.721 "num_blocks": 65536, 00:16:26.721 "uuid": "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090", 00:16:26.721 "assigned_rate_limits": { 00:16:26.721 "rw_ios_per_sec": 0, 00:16:26.721 "rw_mbytes_per_sec": 0, 00:16:26.721 "r_mbytes_per_sec": 0, 00:16:26.721 "w_mbytes_per_sec": 0 00:16:26.721 }, 00:16:26.721 "claimed": true, 00:16:26.721 "claim_type": "exclusive_write", 00:16:26.721 "zoned": false, 00:16:26.721 "supported_io_types": { 00:16:26.721 "read": true, 00:16:26.721 "write": true, 00:16:26.721 "unmap": true, 00:16:26.721 "flush": true, 00:16:26.721 "reset": true, 00:16:26.721 "nvme_admin": false, 00:16:26.721 "nvme_io": false, 00:16:26.721 "nvme_io_md": false, 00:16:26.721 "write_zeroes": true, 00:16:26.721 "zcopy": true, 00:16:26.721 "get_zone_info": false, 00:16:26.721 "zone_management": false, 00:16:26.721 "zone_append": false, 00:16:26.721 "compare": false, 00:16:26.721 "compare_and_write": false, 00:16:26.721 "abort": true, 00:16:26.721 "seek_hole": false, 00:16:26.721 "seek_data": false, 00:16:26.721 "copy": true, 00:16:26.721 "nvme_iov_md": false 00:16:26.721 }, 00:16:26.721 "memory_domains": [ 00:16:26.721 { 00:16:26.721 "dma_device_id": "system", 00:16:26.721 "dma_device_type": 1 00:16:26.721 }, 00:16:26.721 { 00:16:26.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.721 "dma_device_type": 2 00:16:26.721 } 00:16:26.721 ], 00:16:26.721 "driver_specific": {} 00:16:26.721 } 00:16:26.721 ] 00:16:26.721 07:21:59 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.721 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:26.980 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.980 "name": "Existed_Raid", 00:16:26.980 "uuid": "2beea0d9-ef55-4119-8b6b-5b9124abcd6a", 00:16:26.980 "strip_size_kb": 64, 00:16:26.980 "state": "configuring", 00:16:26.980 "raid_level": "concat", 00:16:26.980 "superblock": true, 00:16:26.980 "num_base_bdevs": 3, 00:16:26.980 "num_base_bdevs_discovered": 1, 00:16:26.980 "num_base_bdevs_operational": 3, 00:16:26.980 "base_bdevs_list": [ 00:16:26.980 { 00:16:26.980 "name": "BaseBdev1", 00:16:26.980 "uuid": "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090", 00:16:26.980 "is_configured": true, 00:16:26.980 "data_offset": 2048, 00:16:26.980 "data_size": 63488 00:16:26.980 }, 00:16:26.980 { 00:16:26.980 "name": "BaseBdev2", 00:16:26.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.980 "is_configured": false, 00:16:26.980 "data_offset": 0, 00:16:26.980 "data_size": 0 00:16:26.980 }, 00:16:26.980 { 00:16:26.980 "name": "BaseBdev3", 00:16:26.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.980 "is_configured": false, 00:16:26.980 "data_offset": 0, 00:16:26.980 "data_size": 0 00:16:26.980 } 00:16:26.980 ] 00:16:26.980 }' 00:16:26.980 07:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.980 07:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:27.547 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:27.805 [2024-07-25 07:22:00.213079] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:27.805 [2024-07-25 07:22:00.213116] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf97790 name Existed_Raid, state configuring 00:16:27.805 07:22:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:28.064 [2024-07-25 07:22:00.441718] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:28.064 [2024-07-25 07:22:00.443084] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:28.064 [2024-07-25 07:22:00.443118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:28.064 [2024-07-25 07:22:00.443128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:28.064 [2024-07-25 07:22:00.443149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.064 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.323 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.323 "name": "Existed_Raid", 00:16:28.323 "uuid": "bacdb68b-8b4a-4431-84f8-e413560e17e5", 00:16:28.323 "strip_size_kb": 64, 00:16:28.323 "state": "configuring", 00:16:28.323 "raid_level": "concat", 00:16:28.323 "superblock": true, 00:16:28.323 "num_base_bdevs": 3, 00:16:28.323 "num_base_bdevs_discovered": 1, 00:16:28.323 "num_base_bdevs_operational": 3, 00:16:28.323 "base_bdevs_list": [ 00:16:28.323 { 00:16:28.323 "name": "BaseBdev1", 00:16:28.323 "uuid": "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090", 00:16:28.323 "is_configured": true, 00:16:28.323 "data_offset": 2048, 00:16:28.323 "data_size": 63488 00:16:28.323 }, 00:16:28.323 { 00:16:28.323 "name": "BaseBdev2", 00:16:28.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.323 "is_configured": false, 00:16:28.323 "data_offset": 0, 
00:16:28.323 "data_size": 0 00:16:28.323 }, 00:16:28.323 { 00:16:28.323 "name": "BaseBdev3", 00:16:28.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.323 "is_configured": false, 00:16:28.323 "data_offset": 0, 00:16:28.323 "data_size": 0 00:16:28.323 } 00:16:28.323 ] 00:16:28.323 }' 00:16:28.323 07:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.323 07:22:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.890 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:29.149 [2024-07-25 07:22:01.467609] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:29.149 BaseBdev2 00:16:29.149 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:29.149 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:29.149 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:29.149 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:29.149 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:29.149 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:29.149 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:29.408 [ 00:16:29.408 { 00:16:29.408 "name": "BaseBdev2", 00:16:29.408 "aliases": [ 00:16:29.408 "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4" 00:16:29.408 ], 00:16:29.408 "product_name": "Malloc disk", 00:16:29.408 "block_size": 512, 00:16:29.408 "num_blocks": 65536, 00:16:29.408 "uuid": "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4", 00:16:29.408 "assigned_rate_limits": { 00:16:29.408 "rw_ios_per_sec": 0, 00:16:29.408 "rw_mbytes_per_sec": 0, 00:16:29.408 "r_mbytes_per_sec": 0, 00:16:29.408 "w_mbytes_per_sec": 0 00:16:29.408 }, 00:16:29.408 "claimed": true, 00:16:29.408 "claim_type": "exclusive_write", 00:16:29.408 "zoned": false, 00:16:29.408 "supported_io_types": { 00:16:29.408 "read": true, 00:16:29.408 "write": true, 00:16:29.408 "unmap": true, 00:16:29.408 "flush": true, 00:16:29.408 "reset": true, 00:16:29.408 "nvme_admin": false, 00:16:29.408 "nvme_io": false, 00:16:29.408 "nvme_io_md": false, 00:16:29.408 "write_zeroes": true, 00:16:29.408 "zcopy": true, 00:16:29.408 "get_zone_info": false, 00:16:29.408 "zone_management": false, 00:16:29.408 "zone_append": false, 00:16:29.408 "compare": false, 00:16:29.408 "compare_and_write": false, 00:16:29.408 "abort": true, 00:16:29.408 "seek_hole": false, 00:16:29.408 "seek_data": false, 00:16:29.408 "copy": true, 00:16:29.408 "nvme_iov_md": false 00:16:29.408 }, 00:16:29.408 "memory_domains": [ 00:16:29.408 { 00:16:29.408 "dma_device_id": "system", 00:16:29.408 "dma_device_type": 1 00:16:29.408 }, 00:16:29.408 { 00:16:29.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.408 "dma_device_type": 
2 00:16:29.408 } 00:16:29.408 ], 00:16:29.408 "driver_specific": {} 00:16:29.408 } 00:16:29.408 ] 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.408 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.409 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.409 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.409 07:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.667 07:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.667 "name": "Existed_Raid", 00:16:29.667 "uuid": "bacdb68b-8b4a-4431-84f8-e413560e17e5", 00:16:29.667 "strip_size_kb": 64, 00:16:29.667 "state": "configuring", 00:16:29.667 "raid_level": "concat", 00:16:29.667 "superblock": true, 00:16:29.667 "num_base_bdevs": 3, 00:16:29.667 "num_base_bdevs_discovered": 2, 00:16:29.667 "num_base_bdevs_operational": 3, 00:16:29.667 "base_bdevs_list": [ 00:16:29.667 { 00:16:29.667 "name": "BaseBdev1", 00:16:29.667 "uuid": "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090", 00:16:29.667 "is_configured": true, 00:16:29.667 "data_offset": 2048, 00:16:29.667 "data_size": 63488 00:16:29.667 }, 00:16:29.667 { 00:16:29.667 "name": "BaseBdev2", 00:16:29.667 "uuid": "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4", 00:16:29.667 "is_configured": true, 00:16:29.667 "data_offset": 2048, 00:16:29.667 "data_size": 63488 00:16:29.667 }, 00:16:29.667 { 00:16:29.667 "name": "BaseBdev3", 00:16:29.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.667 "is_configured": false, 00:16:29.667 "data_offset": 0, 00:16:29.667 "data_size": 0 00:16:29.667 } 00:16:29.667 ] 00:16:29.667 }' 00:16:29.667 07:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.667 07:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:30.235 07:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev3 00:16:30.494 [2024-07-25 07:22:02.946578] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:30.494 [2024-07-25 07:22:02.946718] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf98680 00:16:30.494 [2024-07-25 07:22:02.946731] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:30.494 [2024-07-25 07:22:02.946888] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf98350 00:16:30.494 [2024-07-25 07:22:02.947009] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf98680 00:16:30.494 [2024-07-25 07:22:02.947018] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf98680 00:16:30.494 [2024-07-25 07:22:02.947101] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:30.494 BaseBdev3 00:16:30.494 07:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:30.494 07:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:30.494 07:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:30.494 07:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:30.494 07:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:30.494 07:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:30.494 07:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.752 07:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:31.010 [ 00:16:31.010 { 00:16:31.010 "name": "BaseBdev3", 00:16:31.010 "aliases": [ 00:16:31.010 "57b94508-a3da-4ada-ae85-9134ab2f0e22" 00:16:31.010 ], 00:16:31.010 "product_name": "Malloc disk", 00:16:31.010 "block_size": 512, 00:16:31.010 "num_blocks": 65536, 00:16:31.010 "uuid": "57b94508-a3da-4ada-ae85-9134ab2f0e22", 00:16:31.010 "assigned_rate_limits": { 00:16:31.010 "rw_ios_per_sec": 0, 00:16:31.010 "rw_mbytes_per_sec": 0, 00:16:31.010 "r_mbytes_per_sec": 0, 00:16:31.010 "w_mbytes_per_sec": 0 00:16:31.010 }, 00:16:31.010 "claimed": true, 00:16:31.010 "claim_type": "exclusive_write", 00:16:31.010 "zoned": false, 00:16:31.010 "supported_io_types": { 00:16:31.010 "read": true, 00:16:31.010 "write": true, 00:16:31.010 "unmap": true, 00:16:31.010 "flush": true, 00:16:31.010 "reset": true, 00:16:31.010 "nvme_admin": false, 00:16:31.010 "nvme_io": false, 00:16:31.010 "nvme_io_md": false, 00:16:31.010 "write_zeroes": true, 00:16:31.010 "zcopy": true, 00:16:31.010 "get_zone_info": false, 00:16:31.010 "zone_management": false, 00:16:31.010 "zone_append": false, 00:16:31.010 "compare": false, 00:16:31.010 "compare_and_write": false, 00:16:31.010 "abort": true, 00:16:31.010 "seek_hole": false, 00:16:31.010 "seek_data": false, 00:16:31.010 "copy": true, 00:16:31.010 "nvme_iov_md": false 00:16:31.010 }, 00:16:31.010 "memory_domains": [ 00:16:31.010 { 00:16:31.010 "dma_device_id": "system", 00:16:31.010 "dma_device_type": 1 00:16:31.010 }, 00:16:31.010 { 00:16:31.010 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.010 "dma_device_type": 2 00:16:31.010 } 00:16:31.010 ], 00:16:31.010 "driver_specific": {} 00:16:31.010 } 00:16:31.010 ] 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.010 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.269 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.269 "name": "Existed_Raid", 00:16:31.269 "uuid": "bacdb68b-8b4a-4431-84f8-e413560e17e5", 00:16:31.269 "strip_size_kb": 64, 00:16:31.269 "state": "online", 00:16:31.269 "raid_level": "concat", 00:16:31.269 "superblock": true, 00:16:31.269 "num_base_bdevs": 3, 00:16:31.269 "num_base_bdevs_discovered": 3, 00:16:31.269 "num_base_bdevs_operational": 3, 00:16:31.269 "base_bdevs_list": [ 00:16:31.269 { 00:16:31.269 "name": "BaseBdev1", 00:16:31.269 "uuid": "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090", 00:16:31.269 "is_configured": true, 00:16:31.269 "data_offset": 2048, 00:16:31.269 "data_size": 63488 00:16:31.269 }, 00:16:31.269 { 00:16:31.269 "name": "BaseBdev2", 00:16:31.269 "uuid": "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4", 00:16:31.269 "is_configured": true, 00:16:31.269 "data_offset": 2048, 00:16:31.269 "data_size": 63488 00:16:31.269 }, 00:16:31.269 { 00:16:31.269 "name": "BaseBdev3", 00:16:31.269 "uuid": "57b94508-a3da-4ada-ae85-9134ab2f0e22", 00:16:31.269 "is_configured": true, 00:16:31.269 "data_offset": 2048, 00:16:31.269 "data_size": 63488 00:16:31.269 } 00:16:31.269 ] 00:16:31.269 }' 00:16:31.269 07:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.269 07:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # 
verify_raid_bdev_properties Existed_Raid 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:31.837 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:32.097 [2024-07-25 07:22:04.422745] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:32.097 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:32.097 "name": "Existed_Raid", 00:16:32.097 "aliases": [ 00:16:32.097 "bacdb68b-8b4a-4431-84f8-e413560e17e5" 00:16:32.097 ], 00:16:32.097 "product_name": "Raid Volume", 00:16:32.097 "block_size": 512, 00:16:32.097 "num_blocks": 190464, 00:16:32.097 "uuid": "bacdb68b-8b4a-4431-84f8-e413560e17e5", 00:16:32.097 "assigned_rate_limits": { 00:16:32.097 "rw_ios_per_sec": 0, 00:16:32.097 "rw_mbytes_per_sec": 0, 00:16:32.097 "r_mbytes_per_sec": 0, 00:16:32.097 "w_mbytes_per_sec": 0 00:16:32.097 }, 00:16:32.097 "claimed": false, 00:16:32.097 "zoned": false, 00:16:32.097 "supported_io_types": { 00:16:32.097 "read": true, 00:16:32.097 "write": true, 00:16:32.097 "unmap": true, 00:16:32.097 "flush": true, 00:16:32.097 "reset": true, 00:16:32.097 "nvme_admin": false, 00:16:32.097 "nvme_io": false, 00:16:32.097 "nvme_io_md": false, 00:16:32.097 "write_zeroes": true, 00:16:32.097 "zcopy": false, 00:16:32.097 "get_zone_info": false, 00:16:32.097 "zone_management": false, 00:16:32.097 "zone_append": false, 00:16:32.097 "compare": false, 00:16:32.097 "compare_and_write": false, 00:16:32.097 "abort": false, 00:16:32.097 "seek_hole": false, 00:16:32.097 "seek_data": false, 00:16:32.097 "copy": false, 00:16:32.097 "nvme_iov_md": false 00:16:32.097 }, 00:16:32.097 "memory_domains": [ 00:16:32.097 { 00:16:32.097 "dma_device_id": "system", 00:16:32.097 "dma_device_type": 1 00:16:32.097 }, 00:16:32.097 { 00:16:32.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.097 "dma_device_type": 2 00:16:32.097 }, 00:16:32.097 { 00:16:32.097 "dma_device_id": "system", 00:16:32.097 "dma_device_type": 1 00:16:32.097 }, 00:16:32.097 { 00:16:32.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.097 "dma_device_type": 2 00:16:32.097 }, 00:16:32.097 { 00:16:32.097 "dma_device_id": "system", 00:16:32.097 "dma_device_type": 1 00:16:32.097 }, 00:16:32.097 { 00:16:32.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.097 "dma_device_type": 2 00:16:32.097 } 00:16:32.097 ], 00:16:32.097 "driver_specific": { 00:16:32.097 "raid": { 00:16:32.097 "uuid": "bacdb68b-8b4a-4431-84f8-e413560e17e5", 00:16:32.097 "strip_size_kb": 64, 00:16:32.097 "state": "online", 00:16:32.097 "raid_level": "concat", 00:16:32.097 "superblock": true, 00:16:32.097 "num_base_bdevs": 3, 00:16:32.097 "num_base_bdevs_discovered": 3, 00:16:32.097 "num_base_bdevs_operational": 3, 00:16:32.097 "base_bdevs_list": [ 00:16:32.097 { 
00:16:32.097 "name": "BaseBdev1", 00:16:32.097 "uuid": "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090", 00:16:32.097 "is_configured": true, 00:16:32.097 "data_offset": 2048, 00:16:32.097 "data_size": 63488 00:16:32.097 }, 00:16:32.097 { 00:16:32.097 "name": "BaseBdev2", 00:16:32.097 "uuid": "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4", 00:16:32.097 "is_configured": true, 00:16:32.097 "data_offset": 2048, 00:16:32.097 "data_size": 63488 00:16:32.097 }, 00:16:32.097 { 00:16:32.097 "name": "BaseBdev3", 00:16:32.097 "uuid": "57b94508-a3da-4ada-ae85-9134ab2f0e22", 00:16:32.097 "is_configured": true, 00:16:32.097 "data_offset": 2048, 00:16:32.097 "data_size": 63488 00:16:32.097 } 00:16:32.097 ] 00:16:32.097 } 00:16:32.097 } 00:16:32.097 }' 00:16:32.097 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:32.097 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:32.097 BaseBdev2 00:16:32.097 BaseBdev3' 00:16:32.097 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.097 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:32.097 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:32.356 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:32.356 "name": "BaseBdev1", 00:16:32.356 "aliases": [ 00:16:32.356 "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090" 00:16:32.356 ], 00:16:32.356 "product_name": "Malloc disk", 00:16:32.356 "block_size": 512, 00:16:32.356 "num_blocks": 65536, 00:16:32.356 "uuid": "dfc6e32d-f04a-44ae-89ad-ab7e5cf53090", 00:16:32.356 "assigned_rate_limits": { 00:16:32.356 "rw_ios_per_sec": 0, 00:16:32.356 "rw_mbytes_per_sec": 0, 00:16:32.356 "r_mbytes_per_sec": 0, 00:16:32.356 "w_mbytes_per_sec": 0 00:16:32.356 }, 00:16:32.356 "claimed": true, 00:16:32.356 "claim_type": "exclusive_write", 00:16:32.356 "zoned": false, 00:16:32.356 "supported_io_types": { 00:16:32.356 "read": true, 00:16:32.356 "write": true, 00:16:32.356 "unmap": true, 00:16:32.356 "flush": true, 00:16:32.356 "reset": true, 00:16:32.356 "nvme_admin": false, 00:16:32.356 "nvme_io": false, 00:16:32.356 "nvme_io_md": false, 00:16:32.356 "write_zeroes": true, 00:16:32.356 "zcopy": true, 00:16:32.356 "get_zone_info": false, 00:16:32.356 "zone_management": false, 00:16:32.356 "zone_append": false, 00:16:32.356 "compare": false, 00:16:32.356 "compare_and_write": false, 00:16:32.356 "abort": true, 00:16:32.356 "seek_hole": false, 00:16:32.356 "seek_data": false, 00:16:32.356 "copy": true, 00:16:32.356 "nvme_iov_md": false 00:16:32.356 }, 00:16:32.356 "memory_domains": [ 00:16:32.356 { 00:16:32.356 "dma_device_id": "system", 00:16:32.356 "dma_device_type": 1 00:16:32.356 }, 00:16:32.356 { 00:16:32.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.356 "dma_device_type": 2 00:16:32.356 } 00:16:32.356 ], 00:16:32.356 "driver_specific": {} 00:16:32.356 }' 00:16:32.356 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.356 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.356 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.356 
07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.356 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.356 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.615 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.615 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.615 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.615 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.615 07:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.615 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.615 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.615 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:32.615 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:32.874 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:32.874 "name": "BaseBdev2", 00:16:32.874 "aliases": [ 00:16:32.874 "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4" 00:16:32.874 ], 00:16:32.874 "product_name": "Malloc disk", 00:16:32.874 "block_size": 512, 00:16:32.874 "num_blocks": 65536, 00:16:32.874 "uuid": "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4", 00:16:32.874 "assigned_rate_limits": { 00:16:32.874 "rw_ios_per_sec": 0, 00:16:32.874 "rw_mbytes_per_sec": 0, 00:16:32.874 "r_mbytes_per_sec": 0, 00:16:32.874 "w_mbytes_per_sec": 0 00:16:32.874 }, 00:16:32.874 "claimed": true, 00:16:32.874 "claim_type": "exclusive_write", 00:16:32.874 "zoned": false, 00:16:32.874 "supported_io_types": { 00:16:32.874 "read": true, 00:16:32.874 "write": true, 00:16:32.874 "unmap": true, 00:16:32.874 "flush": true, 00:16:32.874 "reset": true, 00:16:32.874 "nvme_admin": false, 00:16:32.874 "nvme_io": false, 00:16:32.874 "nvme_io_md": false, 00:16:32.874 "write_zeroes": true, 00:16:32.874 "zcopy": true, 00:16:32.874 "get_zone_info": false, 00:16:32.874 "zone_management": false, 00:16:32.874 "zone_append": false, 00:16:32.874 "compare": false, 00:16:32.874 "compare_and_write": false, 00:16:32.874 "abort": true, 00:16:32.874 "seek_hole": false, 00:16:32.874 "seek_data": false, 00:16:32.874 "copy": true, 00:16:32.874 "nvme_iov_md": false 00:16:32.874 }, 00:16:32.874 "memory_domains": [ 00:16:32.874 { 00:16:32.874 "dma_device_id": "system", 00:16:32.874 "dma_device_type": 1 00:16:32.874 }, 00:16:32.874 { 00:16:32.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.874 "dma_device_type": 2 00:16:32.874 } 00:16:32.874 ], 00:16:32.874 "driver_specific": {} 00:16:32.874 }' 00:16:32.874 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.874 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.874 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.874 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.875 07:22:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:33.134 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.393 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.393 "name": "BaseBdev3", 00:16:33.393 "aliases": [ 00:16:33.393 "57b94508-a3da-4ada-ae85-9134ab2f0e22" 00:16:33.393 ], 00:16:33.393 "product_name": "Malloc disk", 00:16:33.393 "block_size": 512, 00:16:33.393 "num_blocks": 65536, 00:16:33.393 "uuid": "57b94508-a3da-4ada-ae85-9134ab2f0e22", 00:16:33.393 "assigned_rate_limits": { 00:16:33.393 "rw_ios_per_sec": 0, 00:16:33.393 "rw_mbytes_per_sec": 0, 00:16:33.393 "r_mbytes_per_sec": 0, 00:16:33.393 "w_mbytes_per_sec": 0 00:16:33.393 }, 00:16:33.393 "claimed": true, 00:16:33.393 "claim_type": "exclusive_write", 00:16:33.393 "zoned": false, 00:16:33.393 "supported_io_types": { 00:16:33.393 "read": true, 00:16:33.393 "write": true, 00:16:33.393 "unmap": true, 00:16:33.393 "flush": true, 00:16:33.393 "reset": true, 00:16:33.393 "nvme_admin": false, 00:16:33.393 "nvme_io": false, 00:16:33.393 "nvme_io_md": false, 00:16:33.393 "write_zeroes": true, 00:16:33.393 "zcopy": true, 00:16:33.393 "get_zone_info": false, 00:16:33.393 "zone_management": false, 00:16:33.393 "zone_append": false, 00:16:33.393 "compare": false, 00:16:33.393 "compare_and_write": false, 00:16:33.393 "abort": true, 00:16:33.393 "seek_hole": false, 00:16:33.393 "seek_data": false, 00:16:33.393 "copy": true, 00:16:33.393 "nvme_iov_md": false 00:16:33.393 }, 00:16:33.393 "memory_domains": [ 00:16:33.393 { 00:16:33.393 "dma_device_id": "system", 00:16:33.393 "dma_device_type": 1 00:16:33.393 }, 00:16:33.393 { 00:16:33.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.393 "dma_device_type": 2 00:16:33.393 } 00:16:33.393 ], 00:16:33.393 "driver_specific": {} 00:16:33.393 }' 00:16:33.393 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.393 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.393 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.393 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.652 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.652 07:22:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.652 07:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.652 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.652 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.652 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.652 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.653 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.653 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:33.912 [2024-07-25 07:22:06.355612] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:33.912 [2024-07-25 07:22:06.355635] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:33.912 [2024-07-25 07:22:06.355672] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.912 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.171 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.171 "name": "Existed_Raid", 00:16:34.171 "uuid": "bacdb68b-8b4a-4431-84f8-e413560e17e5", 
00:16:34.171 "strip_size_kb": 64, 00:16:34.171 "state": "offline", 00:16:34.171 "raid_level": "concat", 00:16:34.171 "superblock": true, 00:16:34.171 "num_base_bdevs": 3, 00:16:34.171 "num_base_bdevs_discovered": 2, 00:16:34.171 "num_base_bdevs_operational": 2, 00:16:34.171 "base_bdevs_list": [ 00:16:34.171 { 00:16:34.171 "name": null, 00:16:34.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.171 "is_configured": false, 00:16:34.171 "data_offset": 2048, 00:16:34.171 "data_size": 63488 00:16:34.171 }, 00:16:34.171 { 00:16:34.171 "name": "BaseBdev2", 00:16:34.171 "uuid": "a6c7c994-0024-4dc1-981d-02d4dfb1f0a4", 00:16:34.171 "is_configured": true, 00:16:34.171 "data_offset": 2048, 00:16:34.171 "data_size": 63488 00:16:34.171 }, 00:16:34.171 { 00:16:34.171 "name": "BaseBdev3", 00:16:34.171 "uuid": "57b94508-a3da-4ada-ae85-9134ab2f0e22", 00:16:34.171 "is_configured": true, 00:16:34.171 "data_offset": 2048, 00:16:34.171 "data_size": 63488 00:16:34.171 } 00:16:34.171 ] 00:16:34.171 }' 00:16:34.171 07:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.171 07:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.739 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:34.739 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:34.739 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.739 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:34.997 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:34.997 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:34.997 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:35.256 [2024-07-25 07:22:07.595824] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:35.256 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:35.256 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:35.256 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.256 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:35.514 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:35.514 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:35.514 07:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:35.773 [2024-07-25 07:22:08.067148] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:35.773 [2024-07-25 07:22:08.067185] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf98680 name Existed_Raid, state offline 00:16:35.773 
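The deletions above exercise the concat volume's failure path: concat carries no redundancy (has_redundancy returns 1 for it), so removing any one base bdev drops Existed_Raid from online to offline, and removing the remaining members lets raid_bdev_cleanup tear the array down. A minimal manual reproduction of that sequence, using the same RPC socket, script path, and bdev names that appear in the trace (a sketch only; the trailing .state filter is added here for brevity and is not part of the test script):
  # deleting a single member of the concat array is enough to take the raid offline
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # expect "offline"
  # deleting the remaining members triggers raid_bdev_cleanup for Existed_Raid
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3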
07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:35.773 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:35.773 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.773 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:36.032 BaseBdev2 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:36.032 07:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:36.291 07:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:36.550 [ 00:16:36.550 { 00:16:36.550 "name": "BaseBdev2", 00:16:36.550 "aliases": [ 00:16:36.550 "0683d28c-f42c-4645-8e5d-366a2fc4c4c8" 00:16:36.550 ], 00:16:36.550 "product_name": "Malloc disk", 00:16:36.550 "block_size": 512, 00:16:36.550 "num_blocks": 65536, 00:16:36.550 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:36.550 "assigned_rate_limits": { 00:16:36.550 "rw_ios_per_sec": 0, 00:16:36.550 "rw_mbytes_per_sec": 0, 00:16:36.550 "r_mbytes_per_sec": 0, 00:16:36.550 "w_mbytes_per_sec": 0 00:16:36.550 }, 00:16:36.550 "claimed": false, 00:16:36.550 "zoned": false, 00:16:36.550 "supported_io_types": { 00:16:36.550 "read": true, 00:16:36.550 "write": true, 00:16:36.550 "unmap": true, 00:16:36.550 "flush": true, 00:16:36.550 "reset": true, 00:16:36.550 "nvme_admin": false, 00:16:36.550 "nvme_io": false, 00:16:36.550 "nvme_io_md": false, 00:16:36.550 "write_zeroes": true, 00:16:36.550 "zcopy": true, 00:16:36.550 "get_zone_info": false, 00:16:36.550 "zone_management": false, 00:16:36.550 "zone_append": false, 00:16:36.550 "compare": false, 00:16:36.550 "compare_and_write": false, 00:16:36.550 "abort": true, 
00:16:36.550 "seek_hole": false, 00:16:36.550 "seek_data": false, 00:16:36.550 "copy": true, 00:16:36.550 "nvme_iov_md": false 00:16:36.550 }, 00:16:36.550 "memory_domains": [ 00:16:36.550 { 00:16:36.550 "dma_device_id": "system", 00:16:36.550 "dma_device_type": 1 00:16:36.550 }, 00:16:36.550 { 00:16:36.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.550 "dma_device_type": 2 00:16:36.550 } 00:16:36.550 ], 00:16:36.550 "driver_specific": {} 00:16:36.550 } 00:16:36.550 ] 00:16:36.550 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:36.550 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:36.550 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:36.550 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:36.809 BaseBdev3 00:16:36.809 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:36.809 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:36.809 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:36.809 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:36.809 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:36.809 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:36.809 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:37.067 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:37.326 [ 00:16:37.326 { 00:16:37.326 "name": "BaseBdev3", 00:16:37.326 "aliases": [ 00:16:37.326 "16977ff9-5d23-4e63-a28a-f056bd487944" 00:16:37.326 ], 00:16:37.326 "product_name": "Malloc disk", 00:16:37.326 "block_size": 512, 00:16:37.326 "num_blocks": 65536, 00:16:37.326 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:37.326 "assigned_rate_limits": { 00:16:37.326 "rw_ios_per_sec": 0, 00:16:37.326 "rw_mbytes_per_sec": 0, 00:16:37.326 "r_mbytes_per_sec": 0, 00:16:37.326 "w_mbytes_per_sec": 0 00:16:37.326 }, 00:16:37.326 "claimed": false, 00:16:37.326 "zoned": false, 00:16:37.326 "supported_io_types": { 00:16:37.326 "read": true, 00:16:37.326 "write": true, 00:16:37.326 "unmap": true, 00:16:37.326 "flush": true, 00:16:37.326 "reset": true, 00:16:37.326 "nvme_admin": false, 00:16:37.326 "nvme_io": false, 00:16:37.326 "nvme_io_md": false, 00:16:37.326 "write_zeroes": true, 00:16:37.326 "zcopy": true, 00:16:37.326 "get_zone_info": false, 00:16:37.326 "zone_management": false, 00:16:37.326 "zone_append": false, 00:16:37.326 "compare": false, 00:16:37.326 "compare_and_write": false, 00:16:37.326 "abort": true, 00:16:37.326 "seek_hole": false, 00:16:37.326 "seek_data": false, 00:16:37.326 "copy": true, 00:16:37.326 "nvme_iov_md": false 00:16:37.326 }, 00:16:37.326 "memory_domains": [ 00:16:37.326 { 00:16:37.326 "dma_device_id": "system", 00:16:37.326 
"dma_device_type": 1 00:16:37.326 }, 00:16:37.326 { 00:16:37.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.326 "dma_device_type": 2 00:16:37.326 } 00:16:37.326 ], 00:16:37.326 "driver_specific": {} 00:16:37.326 } 00:16:37.326 ] 00:16:37.326 07:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:37.326 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:37.326 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:37.326 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:37.585 [2024-07-25 07:22:09.886647] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:37.585 [2024-07-25 07:22:09.886684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:37.585 [2024-07-25 07:22:09.886701] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:37.585 [2024-07-25 07:22:09.887902] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.585 07:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.844 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.844 "name": "Existed_Raid", 00:16:37.844 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:37.844 "strip_size_kb": 64, 00:16:37.844 "state": "configuring", 00:16:37.844 "raid_level": "concat", 00:16:37.844 "superblock": true, 00:16:37.844 "num_base_bdevs": 3, 00:16:37.844 "num_base_bdevs_discovered": 2, 00:16:37.844 "num_base_bdevs_operational": 3, 00:16:37.844 "base_bdevs_list": [ 00:16:37.844 { 00:16:37.844 "name": "BaseBdev1", 00:16:37.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.844 "is_configured": false, 00:16:37.844 "data_offset": 0, 
00:16:37.844 "data_size": 0 00:16:37.844 }, 00:16:37.844 { 00:16:37.844 "name": "BaseBdev2", 00:16:37.844 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:37.844 "is_configured": true, 00:16:37.844 "data_offset": 2048, 00:16:37.844 "data_size": 63488 00:16:37.844 }, 00:16:37.844 { 00:16:37.844 "name": "BaseBdev3", 00:16:37.844 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:37.844 "is_configured": true, 00:16:37.844 "data_offset": 2048, 00:16:37.844 "data_size": 63488 00:16:37.844 } 00:16:37.844 ] 00:16:37.844 }' 00:16:37.844 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.844 07:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:38.411 [2024-07-25 07:22:10.921390] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:38.411 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.671 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.671 07:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:38.671 07:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.671 "name": "Existed_Raid", 00:16:38.671 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:38.671 "strip_size_kb": 64, 00:16:38.671 "state": "configuring", 00:16:38.671 "raid_level": "concat", 00:16:38.671 "superblock": true, 00:16:38.671 "num_base_bdevs": 3, 00:16:38.671 "num_base_bdevs_discovered": 1, 00:16:38.671 "num_base_bdevs_operational": 3, 00:16:38.671 "base_bdevs_list": [ 00:16:38.671 { 00:16:38.671 "name": "BaseBdev1", 00:16:38.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:38.671 "is_configured": false, 00:16:38.671 "data_offset": 0, 00:16:38.671 "data_size": 0 00:16:38.671 }, 00:16:38.671 { 00:16:38.671 "name": null, 00:16:38.671 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:38.671 "is_configured": false, 00:16:38.671 "data_offset": 2048, 00:16:38.671 "data_size": 63488 00:16:38.671 }, 
00:16:38.671 { 00:16:38.671 "name": "BaseBdev3", 00:16:38.671 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:38.671 "is_configured": true, 00:16:38.671 "data_offset": 2048, 00:16:38.671 "data_size": 63488 00:16:38.671 } 00:16:38.671 ] 00:16:38.671 }' 00:16:38.671 07:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.671 07:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:39.276 07:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.276 07:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:39.535 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:39.535 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:39.793 [2024-07-25 07:22:12.231892] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:39.793 BaseBdev1 00:16:39.793 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:39.793 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:39.793 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:39.793 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:39.793 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:39.793 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:39.793 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.052 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:40.311 [ 00:16:40.311 { 00:16:40.311 "name": "BaseBdev1", 00:16:40.311 "aliases": [ 00:16:40.311 "15b8efa4-c366-4ce7-93bc-70311b6c7b2b" 00:16:40.311 ], 00:16:40.311 "product_name": "Malloc disk", 00:16:40.311 "block_size": 512, 00:16:40.311 "num_blocks": 65536, 00:16:40.311 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:40.311 "assigned_rate_limits": { 00:16:40.311 "rw_ios_per_sec": 0, 00:16:40.311 "rw_mbytes_per_sec": 0, 00:16:40.311 "r_mbytes_per_sec": 0, 00:16:40.311 "w_mbytes_per_sec": 0 00:16:40.311 }, 00:16:40.311 "claimed": true, 00:16:40.311 "claim_type": "exclusive_write", 00:16:40.311 "zoned": false, 00:16:40.311 "supported_io_types": { 00:16:40.311 "read": true, 00:16:40.311 "write": true, 00:16:40.311 "unmap": true, 00:16:40.311 "flush": true, 00:16:40.311 "reset": true, 00:16:40.311 "nvme_admin": false, 00:16:40.311 "nvme_io": false, 00:16:40.311 "nvme_io_md": false, 00:16:40.311 "write_zeroes": true, 00:16:40.311 "zcopy": true, 00:16:40.311 "get_zone_info": false, 00:16:40.311 "zone_management": false, 00:16:40.311 "zone_append": false, 00:16:40.311 "compare": false, 00:16:40.311 "compare_and_write": 
false, 00:16:40.311 "abort": true, 00:16:40.311 "seek_hole": false, 00:16:40.311 "seek_data": false, 00:16:40.311 "copy": true, 00:16:40.311 "nvme_iov_md": false 00:16:40.311 }, 00:16:40.311 "memory_domains": [ 00:16:40.311 { 00:16:40.311 "dma_device_id": "system", 00:16:40.311 "dma_device_type": 1 00:16:40.311 }, 00:16:40.311 { 00:16:40.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.311 "dma_device_type": 2 00:16:40.311 } 00:16:40.311 ], 00:16:40.311 "driver_specific": {} 00:16:40.311 } 00:16:40.311 ] 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.311 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.570 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.570 "name": "Existed_Raid", 00:16:40.570 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:40.570 "strip_size_kb": 64, 00:16:40.570 "state": "configuring", 00:16:40.570 "raid_level": "concat", 00:16:40.570 "superblock": true, 00:16:40.570 "num_base_bdevs": 3, 00:16:40.570 "num_base_bdevs_discovered": 2, 00:16:40.570 "num_base_bdevs_operational": 3, 00:16:40.570 "base_bdevs_list": [ 00:16:40.570 { 00:16:40.570 "name": "BaseBdev1", 00:16:40.570 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:40.570 "is_configured": true, 00:16:40.570 "data_offset": 2048, 00:16:40.570 "data_size": 63488 00:16:40.570 }, 00:16:40.570 { 00:16:40.570 "name": null, 00:16:40.570 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:40.570 "is_configured": false, 00:16:40.570 "data_offset": 2048, 00:16:40.570 "data_size": 63488 00:16:40.570 }, 00:16:40.570 { 00:16:40.570 "name": "BaseBdev3", 00:16:40.570 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:40.570 "is_configured": true, 00:16:40.570 "data_offset": 2048, 00:16:40.570 "data_size": 63488 00:16:40.570 } 00:16:40.570 ] 00:16:40.570 }' 00:16:40.570 07:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.570 07:22:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:16:41.137 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.137 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:41.396 [2024-07-25 07:22:13.904329] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.396 07:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.655 07:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.655 "name": "Existed_Raid", 00:16:41.655 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:41.655 "strip_size_kb": 64, 00:16:41.655 "state": "configuring", 00:16:41.655 "raid_level": "concat", 00:16:41.655 "superblock": true, 00:16:41.655 "num_base_bdevs": 3, 00:16:41.655 "num_base_bdevs_discovered": 1, 00:16:41.655 "num_base_bdevs_operational": 3, 00:16:41.655 "base_bdevs_list": [ 00:16:41.655 { 00:16:41.655 "name": "BaseBdev1", 00:16:41.655 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:41.655 "is_configured": true, 00:16:41.655 "data_offset": 2048, 00:16:41.655 "data_size": 63488 00:16:41.655 }, 00:16:41.655 { 00:16:41.655 "name": null, 00:16:41.655 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:41.655 "is_configured": false, 00:16:41.655 "data_offset": 2048, 00:16:41.655 "data_size": 63488 00:16:41.655 }, 00:16:41.655 { 00:16:41.655 "name": null, 00:16:41.655 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:41.655 "is_configured": false, 00:16:41.655 "data_offset": 2048, 00:16:41.655 "data_size": 63488 00:16:41.655 } 00:16:41.655 ] 00:16:41.655 }' 
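What the trace is verifying at this point is that a superblock-enabled raid left in the configuring state tolerates base bdevs being detached and re-attached: the removed slot stays in base_bdevs_list (num_base_bdevs remains 3) with a null name and is_configured false, and adding the bdev back flips it to true. A sketch of the equivalent manual RPC round trip, with socket path, raid name, and jq filters taken from the commands above:
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
  # slot 2 is kept but no longer configured
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # false
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # true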
00:16:41.655 07:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.655 07:22:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.223 07:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.223 07:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:42.482 07:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:42.482 07:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:42.741 [2024-07-25 07:22:15.111507] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.741 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.000 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.000 "name": "Existed_Raid", 00:16:43.000 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:43.000 "strip_size_kb": 64, 00:16:43.000 "state": "configuring", 00:16:43.000 "raid_level": "concat", 00:16:43.000 "superblock": true, 00:16:43.000 "num_base_bdevs": 3, 00:16:43.000 "num_base_bdevs_discovered": 2, 00:16:43.000 "num_base_bdevs_operational": 3, 00:16:43.000 "base_bdevs_list": [ 00:16:43.000 { 00:16:43.000 "name": "BaseBdev1", 00:16:43.000 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:43.000 "is_configured": true, 00:16:43.000 "data_offset": 2048, 00:16:43.000 "data_size": 63488 00:16:43.000 }, 00:16:43.000 { 00:16:43.000 "name": null, 00:16:43.000 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:43.000 "is_configured": false, 00:16:43.000 "data_offset": 2048, 00:16:43.000 "data_size": 63488 00:16:43.000 }, 00:16:43.000 { 00:16:43.000 "name": "BaseBdev3", 
00:16:43.000 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:43.000 "is_configured": true, 00:16:43.000 "data_offset": 2048, 00:16:43.000 "data_size": 63488 00:16:43.000 } 00:16:43.000 ] 00:16:43.000 }' 00:16:43.000 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.000 07:22:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.568 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.568 07:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:43.826 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:43.826 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:44.085 [2024-07-25 07:22:16.382871] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.085 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.343 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.343 "name": "Existed_Raid", 00:16:44.343 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:44.343 "strip_size_kb": 64, 00:16:44.343 "state": "configuring", 00:16:44.343 "raid_level": "concat", 00:16:44.343 "superblock": true, 00:16:44.343 "num_base_bdevs": 3, 00:16:44.343 "num_base_bdevs_discovered": 1, 00:16:44.343 "num_base_bdevs_operational": 3, 00:16:44.343 "base_bdevs_list": [ 00:16:44.343 { 00:16:44.343 "name": null, 00:16:44.343 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:44.343 "is_configured": false, 00:16:44.343 "data_offset": 2048, 00:16:44.343 "data_size": 63488 00:16:44.343 }, 00:16:44.343 { 00:16:44.343 "name": null, 00:16:44.343 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 
00:16:44.343 "is_configured": false, 00:16:44.343 "data_offset": 2048, 00:16:44.343 "data_size": 63488 00:16:44.343 }, 00:16:44.343 { 00:16:44.343 "name": "BaseBdev3", 00:16:44.343 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:44.343 "is_configured": true, 00:16:44.343 "data_offset": 2048, 00:16:44.343 "data_size": 63488 00:16:44.343 } 00:16:44.343 ] 00:16:44.343 }' 00:16:44.343 07:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.343 07:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:44.910 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.910 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:44.910 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:44.910 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:45.168 [2024-07-25 07:22:17.628018] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.168 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.427 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.427 "name": "Existed_Raid", 00:16:45.427 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:45.427 "strip_size_kb": 64, 00:16:45.427 "state": "configuring", 00:16:45.427 "raid_level": "concat", 00:16:45.427 "superblock": true, 00:16:45.427 "num_base_bdevs": 3, 00:16:45.427 "num_base_bdevs_discovered": 2, 00:16:45.427 "num_base_bdevs_operational": 3, 00:16:45.427 "base_bdevs_list": [ 00:16:45.427 { 00:16:45.427 "name": null, 00:16:45.427 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:45.427 
"is_configured": false, 00:16:45.427 "data_offset": 2048, 00:16:45.427 "data_size": 63488 00:16:45.427 }, 00:16:45.427 { 00:16:45.427 "name": "BaseBdev2", 00:16:45.427 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:45.427 "is_configured": true, 00:16:45.427 "data_offset": 2048, 00:16:45.427 "data_size": 63488 00:16:45.427 }, 00:16:45.427 { 00:16:45.427 "name": "BaseBdev3", 00:16:45.427 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:45.427 "is_configured": true, 00:16:45.427 "data_offset": 2048, 00:16:45.427 "data_size": 63488 00:16:45.427 } 00:16:45.427 ] 00:16:45.427 }' 00:16:45.427 07:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.427 07:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.993 07:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.993 07:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:46.251 07:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:46.251 07:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.251 07:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:46.509 07:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 15b8efa4-c366-4ce7-93bc-70311b6c7b2b 00:16:46.767 [2024-07-25 07:22:19.103086] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:46.767 [2024-07-25 07:22:19.103226] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1146000 00:16:46.767 [2024-07-25 07:22:19.103239] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:46.767 [2024-07-25 07:22:19.103401] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf98350 00:16:46.767 [2024-07-25 07:22:19.103506] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1146000 00:16:46.767 [2024-07-25 07:22:19.103515] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1146000 00:16:46.767 [2024-07-25 07:22:19.103597] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:46.767 NewBaseBdev 00:16:46.767 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:46.767 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:46.767 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:46.767 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:46.767 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:46.767 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:46.767 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:47.027 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:47.027 [ 00:16:47.027 { 00:16:47.027 "name": "NewBaseBdev", 00:16:47.027 "aliases": [ 00:16:47.027 "15b8efa4-c366-4ce7-93bc-70311b6c7b2b" 00:16:47.027 ], 00:16:47.027 "product_name": "Malloc disk", 00:16:47.027 "block_size": 512, 00:16:47.027 "num_blocks": 65536, 00:16:47.027 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:47.027 "assigned_rate_limits": { 00:16:47.027 "rw_ios_per_sec": 0, 00:16:47.027 "rw_mbytes_per_sec": 0, 00:16:47.027 "r_mbytes_per_sec": 0, 00:16:47.027 "w_mbytes_per_sec": 0 00:16:47.027 }, 00:16:47.027 "claimed": true, 00:16:47.027 "claim_type": "exclusive_write", 00:16:47.027 "zoned": false, 00:16:47.027 "supported_io_types": { 00:16:47.027 "read": true, 00:16:47.027 "write": true, 00:16:47.027 "unmap": true, 00:16:47.027 "flush": true, 00:16:47.027 "reset": true, 00:16:47.027 "nvme_admin": false, 00:16:47.027 "nvme_io": false, 00:16:47.027 "nvme_io_md": false, 00:16:47.027 "write_zeroes": true, 00:16:47.027 "zcopy": true, 00:16:47.027 "get_zone_info": false, 00:16:47.027 "zone_management": false, 00:16:47.027 "zone_append": false, 00:16:47.027 "compare": false, 00:16:47.027 "compare_and_write": false, 00:16:47.027 "abort": true, 00:16:47.027 "seek_hole": false, 00:16:47.027 "seek_data": false, 00:16:47.027 "copy": true, 00:16:47.027 "nvme_iov_md": false 00:16:47.027 }, 00:16:47.027 "memory_domains": [ 00:16:47.027 { 00:16:47.027 "dma_device_id": "system", 00:16:47.027 "dma_device_type": 1 00:16:47.027 }, 00:16:47.027 { 00:16:47.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.027 "dma_device_type": 2 00:16:47.027 } 00:16:47.027 ], 00:16:47.027 "driver_specific": {} 00:16:47.027 } 00:16:47.027 ] 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.287 "name": "Existed_Raid", 00:16:47.287 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:47.287 "strip_size_kb": 64, 00:16:47.287 "state": "online", 00:16:47.287 "raid_level": "concat", 00:16:47.287 "superblock": true, 00:16:47.287 "num_base_bdevs": 3, 00:16:47.287 "num_base_bdevs_discovered": 3, 00:16:47.287 "num_base_bdevs_operational": 3, 00:16:47.287 "base_bdevs_list": [ 00:16:47.287 { 00:16:47.287 "name": "NewBaseBdev", 00:16:47.287 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:47.287 "is_configured": true, 00:16:47.287 "data_offset": 2048, 00:16:47.287 "data_size": 63488 00:16:47.287 }, 00:16:47.287 { 00:16:47.287 "name": "BaseBdev2", 00:16:47.287 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:47.287 "is_configured": true, 00:16:47.287 "data_offset": 2048, 00:16:47.287 "data_size": 63488 00:16:47.287 }, 00:16:47.287 { 00:16:47.287 "name": "BaseBdev3", 00:16:47.287 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:47.287 "is_configured": true, 00:16:47.287 "data_offset": 2048, 00:16:47.287 "data_size": 63488 00:16:47.287 } 00:16:47.287 ] 00:16:47.287 }' 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.287 07:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:47.855 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:48.114 [2024-07-25 07:22:20.595293] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:48.114 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:48.114 "name": "Existed_Raid", 00:16:48.114 "aliases": [ 00:16:48.114 "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5" 00:16:48.114 ], 00:16:48.114 "product_name": "Raid Volume", 00:16:48.114 "block_size": 512, 00:16:48.114 "num_blocks": 190464, 00:16:48.114 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:48.114 "assigned_rate_limits": { 00:16:48.114 "rw_ios_per_sec": 0, 00:16:48.114 "rw_mbytes_per_sec": 0, 00:16:48.114 "r_mbytes_per_sec": 0, 00:16:48.114 "w_mbytes_per_sec": 0 00:16:48.114 }, 00:16:48.114 "claimed": false, 00:16:48.114 "zoned": false, 00:16:48.114 "supported_io_types": { 00:16:48.114 "read": true, 00:16:48.114 "write": true, 00:16:48.114 "unmap": true, 00:16:48.114 "flush": true, 00:16:48.114 "reset": true, 00:16:48.114 "nvme_admin": false, 00:16:48.114 "nvme_io": false, 00:16:48.114 
"nvme_io_md": false, 00:16:48.114 "write_zeroes": true, 00:16:48.114 "zcopy": false, 00:16:48.114 "get_zone_info": false, 00:16:48.114 "zone_management": false, 00:16:48.114 "zone_append": false, 00:16:48.114 "compare": false, 00:16:48.114 "compare_and_write": false, 00:16:48.114 "abort": false, 00:16:48.114 "seek_hole": false, 00:16:48.114 "seek_data": false, 00:16:48.114 "copy": false, 00:16:48.114 "nvme_iov_md": false 00:16:48.114 }, 00:16:48.114 "memory_domains": [ 00:16:48.114 { 00:16:48.114 "dma_device_id": "system", 00:16:48.114 "dma_device_type": 1 00:16:48.114 }, 00:16:48.114 { 00:16:48.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.114 "dma_device_type": 2 00:16:48.114 }, 00:16:48.114 { 00:16:48.114 "dma_device_id": "system", 00:16:48.114 "dma_device_type": 1 00:16:48.114 }, 00:16:48.114 { 00:16:48.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.114 "dma_device_type": 2 00:16:48.114 }, 00:16:48.114 { 00:16:48.114 "dma_device_id": "system", 00:16:48.114 "dma_device_type": 1 00:16:48.114 }, 00:16:48.114 { 00:16:48.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.114 "dma_device_type": 2 00:16:48.114 } 00:16:48.114 ], 00:16:48.114 "driver_specific": { 00:16:48.114 "raid": { 00:16:48.114 "uuid": "cd93d20a-4fc0-4c48-b0d3-f58d0860bfc5", 00:16:48.114 "strip_size_kb": 64, 00:16:48.114 "state": "online", 00:16:48.114 "raid_level": "concat", 00:16:48.114 "superblock": true, 00:16:48.114 "num_base_bdevs": 3, 00:16:48.114 "num_base_bdevs_discovered": 3, 00:16:48.114 "num_base_bdevs_operational": 3, 00:16:48.114 "base_bdevs_list": [ 00:16:48.115 { 00:16:48.115 "name": "NewBaseBdev", 00:16:48.115 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:48.115 "is_configured": true, 00:16:48.115 "data_offset": 2048, 00:16:48.115 "data_size": 63488 00:16:48.115 }, 00:16:48.115 { 00:16:48.115 "name": "BaseBdev2", 00:16:48.115 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:48.115 "is_configured": true, 00:16:48.115 "data_offset": 2048, 00:16:48.115 "data_size": 63488 00:16:48.115 }, 00:16:48.115 { 00:16:48.115 "name": "BaseBdev3", 00:16:48.115 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:48.115 "is_configured": true, 00:16:48.115 "data_offset": 2048, 00:16:48.115 "data_size": 63488 00:16:48.115 } 00:16:48.115 ] 00:16:48.115 } 00:16:48.115 } 00:16:48.115 }' 00:16:48.115 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:48.372 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:48.372 BaseBdev2 00:16:48.372 BaseBdev3' 00:16:48.372 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.372 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:48.372 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:48.372 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.372 "name": "NewBaseBdev", 00:16:48.372 "aliases": [ 00:16:48.372 "15b8efa4-c366-4ce7-93bc-70311b6c7b2b" 00:16:48.372 ], 00:16:48.372 "product_name": "Malloc disk", 00:16:48.372 "block_size": 512, 00:16:48.372 "num_blocks": 65536, 00:16:48.372 "uuid": "15b8efa4-c366-4ce7-93bc-70311b6c7b2b", 00:16:48.372 "assigned_rate_limits": { 00:16:48.372 
"rw_ios_per_sec": 0, 00:16:48.372 "rw_mbytes_per_sec": 0, 00:16:48.372 "r_mbytes_per_sec": 0, 00:16:48.372 "w_mbytes_per_sec": 0 00:16:48.372 }, 00:16:48.372 "claimed": true, 00:16:48.372 "claim_type": "exclusive_write", 00:16:48.372 "zoned": false, 00:16:48.372 "supported_io_types": { 00:16:48.372 "read": true, 00:16:48.372 "write": true, 00:16:48.372 "unmap": true, 00:16:48.372 "flush": true, 00:16:48.372 "reset": true, 00:16:48.372 "nvme_admin": false, 00:16:48.372 "nvme_io": false, 00:16:48.372 "nvme_io_md": false, 00:16:48.372 "write_zeroes": true, 00:16:48.372 "zcopy": true, 00:16:48.372 "get_zone_info": false, 00:16:48.372 "zone_management": false, 00:16:48.372 "zone_append": false, 00:16:48.372 "compare": false, 00:16:48.372 "compare_and_write": false, 00:16:48.372 "abort": true, 00:16:48.372 "seek_hole": false, 00:16:48.372 "seek_data": false, 00:16:48.372 "copy": true, 00:16:48.372 "nvme_iov_md": false 00:16:48.372 }, 00:16:48.372 "memory_domains": [ 00:16:48.372 { 00:16:48.372 "dma_device_id": "system", 00:16:48.372 "dma_device_type": 1 00:16:48.372 }, 00:16:48.372 { 00:16:48.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.372 "dma_device_type": 2 00:16:48.372 } 00:16:48.372 ], 00:16:48.372 "driver_specific": {} 00:16:48.372 }' 00:16:48.372 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.630 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.630 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.630 07:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.630 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.630 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:48.630 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.630 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.630 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:48.630 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.889 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.890 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:48.890 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.890 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:48.890 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:49.148 "name": "BaseBdev2", 00:16:49.148 "aliases": [ 00:16:49.148 "0683d28c-f42c-4645-8e5d-366a2fc4c4c8" 00:16:49.148 ], 00:16:49.148 "product_name": "Malloc disk", 00:16:49.148 "block_size": 512, 00:16:49.148 "num_blocks": 65536, 00:16:49.148 "uuid": "0683d28c-f42c-4645-8e5d-366a2fc4c4c8", 00:16:49.148 "assigned_rate_limits": { 00:16:49.148 "rw_ios_per_sec": 0, 00:16:49.148 "rw_mbytes_per_sec": 0, 00:16:49.148 "r_mbytes_per_sec": 0, 00:16:49.148 "w_mbytes_per_sec": 0 
00:16:49.148 }, 00:16:49.148 "claimed": true, 00:16:49.148 "claim_type": "exclusive_write", 00:16:49.148 "zoned": false, 00:16:49.148 "supported_io_types": { 00:16:49.148 "read": true, 00:16:49.148 "write": true, 00:16:49.148 "unmap": true, 00:16:49.148 "flush": true, 00:16:49.148 "reset": true, 00:16:49.148 "nvme_admin": false, 00:16:49.148 "nvme_io": false, 00:16:49.148 "nvme_io_md": false, 00:16:49.148 "write_zeroes": true, 00:16:49.148 "zcopy": true, 00:16:49.148 "get_zone_info": false, 00:16:49.148 "zone_management": false, 00:16:49.148 "zone_append": false, 00:16:49.148 "compare": false, 00:16:49.148 "compare_and_write": false, 00:16:49.148 "abort": true, 00:16:49.148 "seek_hole": false, 00:16:49.148 "seek_data": false, 00:16:49.148 "copy": true, 00:16:49.148 "nvme_iov_md": false 00:16:49.148 }, 00:16:49.148 "memory_domains": [ 00:16:49.148 { 00:16:49.148 "dma_device_id": "system", 00:16:49.148 "dma_device_type": 1 00:16:49.148 }, 00:16:49.148 { 00:16:49.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.148 "dma_device_type": 2 00:16:49.148 } 00:16:49.148 ], 00:16:49.148 "driver_specific": {} 00:16:49.148 }' 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.148 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.407 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.408 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.408 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.408 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.408 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:49.408 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:49.408 07:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:49.666 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:49.666 "name": "BaseBdev3", 00:16:49.666 "aliases": [ 00:16:49.666 "16977ff9-5d23-4e63-a28a-f056bd487944" 00:16:49.666 ], 00:16:49.666 "product_name": "Malloc disk", 00:16:49.666 "block_size": 512, 00:16:49.666 "num_blocks": 65536, 00:16:49.666 "uuid": "16977ff9-5d23-4e63-a28a-f056bd487944", 00:16:49.666 "assigned_rate_limits": { 00:16:49.666 "rw_ios_per_sec": 0, 00:16:49.667 "rw_mbytes_per_sec": 0, 00:16:49.667 "r_mbytes_per_sec": 0, 00:16:49.667 "w_mbytes_per_sec": 0 00:16:49.667 }, 00:16:49.667 "claimed": true, 00:16:49.667 "claim_type": "exclusive_write", 00:16:49.667 "zoned": false, 00:16:49.667 
"supported_io_types": { 00:16:49.667 "read": true, 00:16:49.667 "write": true, 00:16:49.667 "unmap": true, 00:16:49.667 "flush": true, 00:16:49.667 "reset": true, 00:16:49.667 "nvme_admin": false, 00:16:49.667 "nvme_io": false, 00:16:49.667 "nvme_io_md": false, 00:16:49.667 "write_zeroes": true, 00:16:49.667 "zcopy": true, 00:16:49.667 "get_zone_info": false, 00:16:49.667 "zone_management": false, 00:16:49.667 "zone_append": false, 00:16:49.667 "compare": false, 00:16:49.667 "compare_and_write": false, 00:16:49.667 "abort": true, 00:16:49.667 "seek_hole": false, 00:16:49.667 "seek_data": false, 00:16:49.667 "copy": true, 00:16:49.667 "nvme_iov_md": false 00:16:49.667 }, 00:16:49.667 "memory_domains": [ 00:16:49.667 { 00:16:49.667 "dma_device_id": "system", 00:16:49.667 "dma_device_type": 1 00:16:49.667 }, 00:16:49.667 { 00:16:49.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.667 "dma_device_type": 2 00:16:49.667 } 00:16:49.667 ], 00:16:49.667 "driver_specific": {} 00:16:49.667 }' 00:16:49.667 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.667 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:49.667 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:49.667 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.667 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.667 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.667 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.926 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.926 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.926 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.926 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.926 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.926 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:50.185 [2024-07-25 07:22:22.576252] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:50.185 [2024-07-25 07:22:22.576286] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:50.185 [2024-07-25 07:22:22.576336] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:50.185 [2024-07-25 07:22:22.576385] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:50.185 [2024-07-25 07:22:22.576396] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1146000 name Existed_Raid, state offline 00:16:50.185 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1632979 00:16:50.185 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1632979 ']' 00:16:50.185 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1632979 00:16:50.186 07:22:22 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:16:50.186 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:50.186 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1632979 00:16:50.186 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:50.186 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:50.186 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1632979' 00:16:50.186 killing process with pid 1632979 00:16:50.186 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1632979 00:16:50.186 [2024-07-25 07:22:22.652055] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:50.186 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1632979 00:16:50.186 [2024-07-25 07:22:22.675529] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:50.445 07:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:50.445 00:16:50.445 real 0m26.816s 00:16:50.445 user 0m49.127s 00:16:50.445 sys 0m4.931s 00:16:50.445 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:50.445 07:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.445 ************************************ 00:16:50.445 END TEST raid_state_function_test_sb 00:16:50.445 ************************************ 00:16:50.445 07:22:22 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:16:50.445 07:22:22 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:16:50.445 07:22:22 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:50.445 07:22:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:50.445 ************************************ 00:16:50.445 START TEST raid_superblock_test 00:16:50.445 ************************************ 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 3 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local 
strip_size_create_arg 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1638073 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1638073 /var/tmp/spdk-raid.sock 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1638073 ']' 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:50.445 07:22:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:50.446 07:22:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:50.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:50.446 07:22:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:50.446 07:22:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.705 [2024-07-25 07:22:23.005037] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:16:50.705 [2024-07-25 07:22:23.005093] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1638073 ] 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.0 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.1 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.2 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.3 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.4 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.5 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.6 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:01.7 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:02.0 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:02.1 cannot be used 00:16:50.705 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.705 EAL: Requested device 0000:3d:02.2 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3d:02.3 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3d:02.4 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3d:02.5 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3d:02.6 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3d:02.7 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.0 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.1 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.2 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.3 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.4 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.5 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.6 cannot be used 00:16:50.706 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:01.7 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.0 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.1 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.2 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.3 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.4 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.5 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.6 cannot be used 00:16:50.706 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:50.706 EAL: Requested device 0000:3f:02.7 cannot be used 00:16:50.706 [2024-07-25 07:22:23.138070] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:50.706 [2024-07-25 07:22:23.223728] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:50.965 [2024-07-25 07:22:23.283513] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:50.965 [2024-07-25 07:22:23.283549] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:51.532 07:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:51.808 malloc1 00:16:51.808 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:51.808 [2024-07-25 07:22:24.325610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:51.808 [2024-07-25 07:22:24.325650] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:51.808 [2024-07-25 07:22:24.325668] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a0280 00:16:51.808 [2024-07-25 07:22:24.325680] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:51.808 [2024-07-25 07:22:24.327200] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:51.808 [2024-07-25 07:22:24.327227] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:51.808 pt1 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:52.078 malloc2 00:16:52.078 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:52.338 [2024-07-25 07:22:24.775364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:52.338 [2024-07-25 07:22:24.775404] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.338 [2024-07-25 07:22:24.775419] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274b8c0 00:16:52.338 [2024-07-25 07:22:24.775430] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.338 [2024-07-25 07:22:24.776744] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.338 [2024-07-25 07:22:24.776770] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:52.338 pt2 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:52.338 07:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:52.597 malloc3 00:16:52.597 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:52.857 [2024-07-25 07:22:25.208804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:52.857 [2024-07-25 07:22:25.208843] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.857 [2024-07-25 07:22:25.208859] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274bef0 00:16:52.857 [2024-07-25 07:22:25.208870] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.857 [2024-07-25 07:22:25.210184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.857 [2024-07-25 07:22:25.210210] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:52.857 pt3 00:16:52.857 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:52.857 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:52.857 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:53.116 [2024-07-25 07:22:25.437426] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:53.116 [2024-07-25 07:22:25.438590] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:53.116 [2024-07-25 07:22:25.438641] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:53.116 [2024-07-25 07:22:25.438785] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x274f330 00:16:53.116 [2024-07-25 07:22:25.438796] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:53.116 [2024-07-25 07:22:25.438970] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b7170 00:16:53.116 [2024-07-25 07:22:25.439099] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x274f330 00:16:53.116 [2024-07-25 07:22:25.439109] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x274f330 00:16:53.116 [2024-07-25 07:22:25.439206] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.116 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:53.376 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.376 "name": "raid_bdev1", 00:16:53.376 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:16:53.376 "strip_size_kb": 64, 00:16:53.376 "state": "online", 00:16:53.376 "raid_level": "concat", 00:16:53.376 "superblock": true, 00:16:53.376 "num_base_bdevs": 3, 00:16:53.376 "num_base_bdevs_discovered": 3, 00:16:53.376 "num_base_bdevs_operational": 3, 00:16:53.376 "base_bdevs_list": [ 00:16:53.376 { 00:16:53.376 "name": "pt1", 00:16:53.376 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:53.376 "is_configured": true, 00:16:53.376 "data_offset": 2048, 00:16:53.376 "data_size": 63488 00:16:53.376 }, 00:16:53.376 { 00:16:53.376 "name": "pt2", 00:16:53.376 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:53.376 "is_configured": true, 00:16:53.376 "data_offset": 2048, 00:16:53.376 "data_size": 63488 00:16:53.376 }, 00:16:53.376 { 00:16:53.376 "name": "pt3", 00:16:53.376 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:53.376 "is_configured": true, 00:16:53.376 "data_offset": 2048, 00:16:53.376 "data_size": 63488 00:16:53.376 } 00:16:53.376 ] 00:16:53.376 }' 00:16:53.376 07:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.376 07:22:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:53.944 [2024-07-25 07:22:26.452338] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:53.944 "name": "raid_bdev1", 00:16:53.944 "aliases": [ 00:16:53.944 "8f8682b3-2b40-4376-9120-f5bb40056ceb" 00:16:53.944 ], 00:16:53.944 "product_name": "Raid Volume", 00:16:53.944 "block_size": 512, 00:16:53.944 "num_blocks": 190464, 00:16:53.944 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:16:53.944 "assigned_rate_limits": { 00:16:53.944 "rw_ios_per_sec": 0, 00:16:53.944 "rw_mbytes_per_sec": 0, 00:16:53.944 
"r_mbytes_per_sec": 0, 00:16:53.944 "w_mbytes_per_sec": 0 00:16:53.944 }, 00:16:53.944 "claimed": false, 00:16:53.944 "zoned": false, 00:16:53.944 "supported_io_types": { 00:16:53.944 "read": true, 00:16:53.944 "write": true, 00:16:53.944 "unmap": true, 00:16:53.944 "flush": true, 00:16:53.944 "reset": true, 00:16:53.944 "nvme_admin": false, 00:16:53.944 "nvme_io": false, 00:16:53.944 "nvme_io_md": false, 00:16:53.944 "write_zeroes": true, 00:16:53.944 "zcopy": false, 00:16:53.944 "get_zone_info": false, 00:16:53.944 "zone_management": false, 00:16:53.944 "zone_append": false, 00:16:53.944 "compare": false, 00:16:53.944 "compare_and_write": false, 00:16:53.944 "abort": false, 00:16:53.944 "seek_hole": false, 00:16:53.944 "seek_data": false, 00:16:53.944 "copy": false, 00:16:53.944 "nvme_iov_md": false 00:16:53.944 }, 00:16:53.944 "memory_domains": [ 00:16:53.944 { 00:16:53.944 "dma_device_id": "system", 00:16:53.944 "dma_device_type": 1 00:16:53.944 }, 00:16:53.944 { 00:16:53.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.944 "dma_device_type": 2 00:16:53.944 }, 00:16:53.944 { 00:16:53.944 "dma_device_id": "system", 00:16:53.944 "dma_device_type": 1 00:16:53.944 }, 00:16:53.944 { 00:16:53.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.944 "dma_device_type": 2 00:16:53.944 }, 00:16:53.944 { 00:16:53.944 "dma_device_id": "system", 00:16:53.944 "dma_device_type": 1 00:16:53.944 }, 00:16:53.944 { 00:16:53.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.944 "dma_device_type": 2 00:16:53.944 } 00:16:53.944 ], 00:16:53.944 "driver_specific": { 00:16:53.944 "raid": { 00:16:53.944 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:16:53.944 "strip_size_kb": 64, 00:16:53.944 "state": "online", 00:16:53.944 "raid_level": "concat", 00:16:53.944 "superblock": true, 00:16:53.944 "num_base_bdevs": 3, 00:16:53.944 "num_base_bdevs_discovered": 3, 00:16:53.944 "num_base_bdevs_operational": 3, 00:16:53.944 "base_bdevs_list": [ 00:16:53.944 { 00:16:53.944 "name": "pt1", 00:16:53.944 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:53.944 "is_configured": true, 00:16:53.944 "data_offset": 2048, 00:16:53.944 "data_size": 63488 00:16:53.944 }, 00:16:53.944 { 00:16:53.944 "name": "pt2", 00:16:53.944 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:53.944 "is_configured": true, 00:16:53.944 "data_offset": 2048, 00:16:53.944 "data_size": 63488 00:16:53.944 }, 00:16:53.944 { 00:16:53.944 "name": "pt3", 00:16:53.944 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:53.944 "is_configured": true, 00:16:53.944 "data_offset": 2048, 00:16:53.944 "data_size": 63488 00:16:53.944 } 00:16:53.944 ] 00:16:53.944 } 00:16:53.944 } 00:16:53.944 }' 00:16:53.944 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:54.203 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:54.203 pt2 00:16:54.203 pt3' 00:16:54.203 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.203 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:54.203 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.463 "name": "pt1", 00:16:54.463 "aliases": [ 
00:16:54.463 "00000000-0000-0000-0000-000000000001" 00:16:54.463 ], 00:16:54.463 "product_name": "passthru", 00:16:54.463 "block_size": 512, 00:16:54.463 "num_blocks": 65536, 00:16:54.463 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:54.463 "assigned_rate_limits": { 00:16:54.463 "rw_ios_per_sec": 0, 00:16:54.463 "rw_mbytes_per_sec": 0, 00:16:54.463 "r_mbytes_per_sec": 0, 00:16:54.463 "w_mbytes_per_sec": 0 00:16:54.463 }, 00:16:54.463 "claimed": true, 00:16:54.463 "claim_type": "exclusive_write", 00:16:54.463 "zoned": false, 00:16:54.463 "supported_io_types": { 00:16:54.463 "read": true, 00:16:54.463 "write": true, 00:16:54.463 "unmap": true, 00:16:54.463 "flush": true, 00:16:54.463 "reset": true, 00:16:54.463 "nvme_admin": false, 00:16:54.463 "nvme_io": false, 00:16:54.463 "nvme_io_md": false, 00:16:54.463 "write_zeroes": true, 00:16:54.463 "zcopy": true, 00:16:54.463 "get_zone_info": false, 00:16:54.463 "zone_management": false, 00:16:54.463 "zone_append": false, 00:16:54.463 "compare": false, 00:16:54.463 "compare_and_write": false, 00:16:54.463 "abort": true, 00:16:54.463 "seek_hole": false, 00:16:54.463 "seek_data": false, 00:16:54.463 "copy": true, 00:16:54.463 "nvme_iov_md": false 00:16:54.463 }, 00:16:54.463 "memory_domains": [ 00:16:54.463 { 00:16:54.463 "dma_device_id": "system", 00:16:54.463 "dma_device_type": 1 00:16:54.463 }, 00:16:54.463 { 00:16:54.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.463 "dma_device_type": 2 00:16:54.463 } 00:16:54.463 ], 00:16:54.463 "driver_specific": { 00:16:54.463 "passthru": { 00:16:54.463 "name": "pt1", 00:16:54.463 "base_bdev_name": "malloc1" 00:16:54.463 } 00:16:54.463 } 00:16:54.463 }' 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.463 07:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.722 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.722 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.722 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.722 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.722 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.722 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:54.722 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.981 "name": "pt2", 00:16:54.981 "aliases": [ 00:16:54.981 "00000000-0000-0000-0000-000000000002" 00:16:54.981 ], 00:16:54.981 "product_name": "passthru", 00:16:54.981 "block_size": 
512, 00:16:54.981 "num_blocks": 65536, 00:16:54.981 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:54.981 "assigned_rate_limits": { 00:16:54.981 "rw_ios_per_sec": 0, 00:16:54.981 "rw_mbytes_per_sec": 0, 00:16:54.981 "r_mbytes_per_sec": 0, 00:16:54.981 "w_mbytes_per_sec": 0 00:16:54.981 }, 00:16:54.981 "claimed": true, 00:16:54.981 "claim_type": "exclusive_write", 00:16:54.981 "zoned": false, 00:16:54.981 "supported_io_types": { 00:16:54.981 "read": true, 00:16:54.981 "write": true, 00:16:54.981 "unmap": true, 00:16:54.981 "flush": true, 00:16:54.981 "reset": true, 00:16:54.981 "nvme_admin": false, 00:16:54.981 "nvme_io": false, 00:16:54.981 "nvme_io_md": false, 00:16:54.981 "write_zeroes": true, 00:16:54.981 "zcopy": true, 00:16:54.981 "get_zone_info": false, 00:16:54.981 "zone_management": false, 00:16:54.981 "zone_append": false, 00:16:54.981 "compare": false, 00:16:54.981 "compare_and_write": false, 00:16:54.981 "abort": true, 00:16:54.981 "seek_hole": false, 00:16:54.981 "seek_data": false, 00:16:54.981 "copy": true, 00:16:54.981 "nvme_iov_md": false 00:16:54.981 }, 00:16:54.981 "memory_domains": [ 00:16:54.981 { 00:16:54.981 "dma_device_id": "system", 00:16:54.981 "dma_device_type": 1 00:16:54.981 }, 00:16:54.981 { 00:16:54.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.981 "dma_device_type": 2 00:16:54.981 } 00:16:54.981 ], 00:16:54.981 "driver_specific": { 00:16:54.981 "passthru": { 00:16:54.981 "name": "pt2", 00:16:54.981 "base_bdev_name": "malloc2" 00:16:54.981 } 00:16:54.981 } 00:16:54.981 }' 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.981 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.240 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.240 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.241 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.241 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.241 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.241 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.241 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:55.241 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.500 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.500 "name": "pt3", 00:16:55.500 "aliases": [ 00:16:55.500 "00000000-0000-0000-0000-000000000003" 00:16:55.500 ], 00:16:55.500 "product_name": "passthru", 00:16:55.500 "block_size": 512, 00:16:55.500 "num_blocks": 65536, 00:16:55.500 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:55.500 
"assigned_rate_limits": { 00:16:55.500 "rw_ios_per_sec": 0, 00:16:55.500 "rw_mbytes_per_sec": 0, 00:16:55.500 "r_mbytes_per_sec": 0, 00:16:55.500 "w_mbytes_per_sec": 0 00:16:55.500 }, 00:16:55.500 "claimed": true, 00:16:55.500 "claim_type": "exclusive_write", 00:16:55.500 "zoned": false, 00:16:55.500 "supported_io_types": { 00:16:55.500 "read": true, 00:16:55.500 "write": true, 00:16:55.500 "unmap": true, 00:16:55.500 "flush": true, 00:16:55.500 "reset": true, 00:16:55.500 "nvme_admin": false, 00:16:55.500 "nvme_io": false, 00:16:55.500 "nvme_io_md": false, 00:16:55.500 "write_zeroes": true, 00:16:55.500 "zcopy": true, 00:16:55.500 "get_zone_info": false, 00:16:55.500 "zone_management": false, 00:16:55.500 "zone_append": false, 00:16:55.500 "compare": false, 00:16:55.500 "compare_and_write": false, 00:16:55.500 "abort": true, 00:16:55.500 "seek_hole": false, 00:16:55.500 "seek_data": false, 00:16:55.500 "copy": true, 00:16:55.500 "nvme_iov_md": false 00:16:55.500 }, 00:16:55.500 "memory_domains": [ 00:16:55.500 { 00:16:55.500 "dma_device_id": "system", 00:16:55.500 "dma_device_type": 1 00:16:55.500 }, 00:16:55.500 { 00:16:55.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.500 "dma_device_type": 2 00:16:55.500 } 00:16:55.500 ], 00:16:55.500 "driver_specific": { 00:16:55.500 "passthru": { 00:16:55.500 "name": "pt3", 00:16:55.500 "base_bdev_name": "malloc3" 00:16:55.500 } 00:16:55.500 } 00:16:55.500 }' 00:16:55.500 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.500 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.500 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.500 07:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.500 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:55.760 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:16:56.019 [2024-07-25 07:22:28.417514] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:56.019 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=8f8682b3-2b40-4376-9120-f5bb40056ceb 00:16:56.019 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 8f8682b3-2b40-4376-9120-f5bb40056ceb ']' 00:16:56.019 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:56.278 [2024-07-25 07:22:28.641837] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:56.278 [2024-07-25 07:22:28.641858] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:56.278 [2024-07-25 07:22:28.641903] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:56.278 [2024-07-25 07:22:28.641950] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:56.278 [2024-07-25 07:22:28.641961] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274f330 name raid_bdev1, state offline 00:16:56.278 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.278 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:16:56.538 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:16:56.538 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:16:56.538 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:16:56.538 07:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:56.797 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:16:56.797 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:56.797 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:16:56.797 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:57.057 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:57.057 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:57.316 07:22:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:57.575 [2024-07-25 07:22:29.997352] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:57.575 [2024-07-25 07:22:29.998601] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:57.575 [2024-07-25 07:22:29.998641] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:57.575 [2024-07-25 07:22:29.998682] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:57.575 [2024-07-25 07:22:29.998717] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:57.575 [2024-07-25 07:22:29.998738] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:57.575 [2024-07-25 07:22:29.998755] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:57.575 [2024-07-25 07:22:29.998764] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274ce30 name raid_bdev1, state configuring 00:16:57.575 request: 00:16:57.575 { 00:16:57.575 "name": "raid_bdev1", 00:16:57.575 "raid_level": "concat", 00:16:57.575 "base_bdevs": [ 00:16:57.575 "malloc1", 00:16:57.575 "malloc2", 00:16:57.575 "malloc3" 00:16:57.575 ], 00:16:57.575 "strip_size_kb": 64, 00:16:57.575 "superblock": false, 00:16:57.575 "method": "bdev_raid_create", 00:16:57.575 "req_id": 1 00:16:57.575 } 00:16:57.575 Got JSON-RPC error response 00:16:57.575 response: 00:16:57.575 { 00:16:57.575 "code": -17, 00:16:57.575 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:57.575 } 00:16:57.575 07:22:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:16:57.575 07:22:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:16:57.575 07:22:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:16:57.575 07:22:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:16:57.575 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.575 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:16:57.834 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:16:57.834 07:22:30 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:16:57.834 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:58.094 [2024-07-25 07:22:30.458557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:58.094 [2024-07-25 07:22:30.458601] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.094 [2024-07-25 07:22:30.458618] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a0e90 00:16:58.094 [2024-07-25 07:22:30.458630] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.094 [2024-07-25 07:22:30.460108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.094 [2024-07-25 07:22:30.460134] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:58.094 [2024-07-25 07:22:30.460212] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:58.094 [2024-07-25 07:22:30.460237] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:58.094 pt1 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.094 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.354 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.354 "name": "raid_bdev1", 00:16:58.354 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:16:58.354 "strip_size_kb": 64, 00:16:58.354 "state": "configuring", 00:16:58.354 "raid_level": "concat", 00:16:58.354 "superblock": true, 00:16:58.354 "num_base_bdevs": 3, 00:16:58.354 "num_base_bdevs_discovered": 1, 00:16:58.354 "num_base_bdevs_operational": 3, 00:16:58.354 "base_bdevs_list": [ 00:16:58.354 { 00:16:58.354 "name": "pt1", 00:16:58.354 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.354 "is_configured": true, 00:16:58.354 "data_offset": 2048, 00:16:58.354 "data_size": 63488 00:16:58.354 }, 00:16:58.354 { 00:16:58.354 "name": null, 00:16:58.354 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.354 
"is_configured": false, 00:16:58.354 "data_offset": 2048, 00:16:58.354 "data_size": 63488 00:16:58.354 }, 00:16:58.354 { 00:16:58.354 "name": null, 00:16:58.354 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.354 "is_configured": false, 00:16:58.354 "data_offset": 2048, 00:16:58.354 "data_size": 63488 00:16:58.354 } 00:16:58.354 ] 00:16:58.354 }' 00:16:58.354 07:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.354 07:22:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.922 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:16:58.922 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:59.181 [2024-07-25 07:22:31.489295] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:59.182 [2024-07-25 07:22:31.489340] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:59.182 [2024-07-25 07:22:31.489356] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2749490 00:16:59.182 [2024-07-25 07:22:31.489367] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:59.182 [2024-07-25 07:22:31.489668] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:59.182 [2024-07-25 07:22:31.489684] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:59.182 [2024-07-25 07:22:31.489742] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:59.182 [2024-07-25 07:22:31.489761] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:59.182 pt2 00:16:59.182 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:59.182 [2024-07-25 07:22:31.709875] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:59.441 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.441 "name": "raid_bdev1", 00:16:59.441 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:16:59.441 "strip_size_kb": 64, 00:16:59.441 "state": "configuring", 00:16:59.441 "raid_level": "concat", 00:16:59.441 "superblock": true, 00:16:59.441 "num_base_bdevs": 3, 00:16:59.442 "num_base_bdevs_discovered": 1, 00:16:59.442 "num_base_bdevs_operational": 3, 00:16:59.442 "base_bdevs_list": [ 00:16:59.442 { 00:16:59.442 "name": "pt1", 00:16:59.442 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.442 "is_configured": true, 00:16:59.442 "data_offset": 2048, 00:16:59.442 "data_size": 63488 00:16:59.442 }, 00:16:59.442 { 00:16:59.442 "name": null, 00:16:59.442 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:59.442 "is_configured": false, 00:16:59.442 "data_offset": 2048, 00:16:59.442 "data_size": 63488 00:16:59.442 }, 00:16:59.442 { 00:16:59.442 "name": null, 00:16:59.442 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:59.442 "is_configured": false, 00:16:59.442 "data_offset": 2048, 00:16:59.442 "data_size": 63488 00:16:59.442 } 00:16:59.442 ] 00:16:59.442 }' 00:16:59.442 07:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.442 07:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.380 07:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:17:00.380 07:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:00.380 07:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:00.380 [2024-07-25 07:22:32.756623] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:00.380 [2024-07-25 07:22:32.756665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.380 [2024-07-25 07:22:32.756682] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2750570 00:17:00.380 [2024-07-25 07:22:32.756694] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.380 [2024-07-25 07:22:32.756988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.380 [2024-07-25 07:22:32.757004] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:00.380 [2024-07-25 07:22:32.757058] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:00.380 [2024-07-25 07:22:32.757076] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:00.380 pt2 00:17:00.380 07:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:00.380 07:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:00.380 07:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:00.639 [2024-07-25 07:22:32.985227] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:00.639 [2024-07-25 07:22:32.985257] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:17:00.639 [2024-07-25 07:22:32.985271] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274e040 00:17:00.639 [2024-07-25 07:22:32.985282] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.639 [2024-07-25 07:22:32.985536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.639 [2024-07-25 07:22:32.985551] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:00.639 [2024-07-25 07:22:32.985596] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:00.639 [2024-07-25 07:22:32.985611] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:00.639 [2024-07-25 07:22:32.985701] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x274fef0 00:17:00.639 [2024-07-25 07:22:32.985711] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:00.639 [2024-07-25 07:22:32.985860] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27515f0 00:17:00.639 [2024-07-25 07:22:32.985972] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x274fef0 00:17:00.639 [2024-07-25 07:22:32.985981] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x274fef0 00:17:00.639 [2024-07-25 07:22:32.986066] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:00.639 pt3 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.639 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:00.898 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.898 "name": "raid_bdev1", 00:17:00.898 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:17:00.898 "strip_size_kb": 64, 00:17:00.898 "state": "online", 00:17:00.898 "raid_level": "concat", 00:17:00.898 "superblock": true, 00:17:00.898 "num_base_bdevs": 3, 00:17:00.898 
"num_base_bdevs_discovered": 3, 00:17:00.898 "num_base_bdevs_operational": 3, 00:17:00.898 "base_bdevs_list": [ 00:17:00.898 { 00:17:00.898 "name": "pt1", 00:17:00.898 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:00.898 "is_configured": true, 00:17:00.898 "data_offset": 2048, 00:17:00.898 "data_size": 63488 00:17:00.898 }, 00:17:00.898 { 00:17:00.898 "name": "pt2", 00:17:00.898 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.898 "is_configured": true, 00:17:00.898 "data_offset": 2048, 00:17:00.898 "data_size": 63488 00:17:00.898 }, 00:17:00.898 { 00:17:00.898 "name": "pt3", 00:17:00.898 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.898 "is_configured": true, 00:17:00.898 "data_offset": 2048, 00:17:00.898 "data_size": 63488 00:17:00.898 } 00:17:00.898 ] 00:17:00.898 }' 00:17:00.898 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.898 07:22:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:01.466 07:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.725 [2024-07-25 07:22:34.040301] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.725 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:01.725 "name": "raid_bdev1", 00:17:01.725 "aliases": [ 00:17:01.725 "8f8682b3-2b40-4376-9120-f5bb40056ceb" 00:17:01.725 ], 00:17:01.725 "product_name": "Raid Volume", 00:17:01.725 "block_size": 512, 00:17:01.725 "num_blocks": 190464, 00:17:01.725 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:17:01.725 "assigned_rate_limits": { 00:17:01.725 "rw_ios_per_sec": 0, 00:17:01.725 "rw_mbytes_per_sec": 0, 00:17:01.725 "r_mbytes_per_sec": 0, 00:17:01.725 "w_mbytes_per_sec": 0 00:17:01.725 }, 00:17:01.725 "claimed": false, 00:17:01.725 "zoned": false, 00:17:01.725 "supported_io_types": { 00:17:01.725 "read": true, 00:17:01.725 "write": true, 00:17:01.725 "unmap": true, 00:17:01.725 "flush": true, 00:17:01.725 "reset": true, 00:17:01.725 "nvme_admin": false, 00:17:01.725 "nvme_io": false, 00:17:01.725 "nvme_io_md": false, 00:17:01.725 "write_zeroes": true, 00:17:01.725 "zcopy": false, 00:17:01.725 "get_zone_info": false, 00:17:01.725 "zone_management": false, 00:17:01.725 "zone_append": false, 00:17:01.725 "compare": false, 00:17:01.725 "compare_and_write": false, 00:17:01.725 "abort": false, 00:17:01.725 "seek_hole": false, 00:17:01.725 "seek_data": false, 00:17:01.725 "copy": false, 00:17:01.725 "nvme_iov_md": false 00:17:01.725 }, 00:17:01.725 "memory_domains": [ 00:17:01.725 { 00:17:01.725 "dma_device_id": "system", 00:17:01.725 "dma_device_type": 1 00:17:01.725 }, 
00:17:01.725 { 00:17:01.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.725 "dma_device_type": 2 00:17:01.725 }, 00:17:01.725 { 00:17:01.725 "dma_device_id": "system", 00:17:01.725 "dma_device_type": 1 00:17:01.725 }, 00:17:01.725 { 00:17:01.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.726 "dma_device_type": 2 00:17:01.726 }, 00:17:01.726 { 00:17:01.726 "dma_device_id": "system", 00:17:01.726 "dma_device_type": 1 00:17:01.726 }, 00:17:01.726 { 00:17:01.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.726 "dma_device_type": 2 00:17:01.726 } 00:17:01.726 ], 00:17:01.726 "driver_specific": { 00:17:01.726 "raid": { 00:17:01.726 "uuid": "8f8682b3-2b40-4376-9120-f5bb40056ceb", 00:17:01.726 "strip_size_kb": 64, 00:17:01.726 "state": "online", 00:17:01.726 "raid_level": "concat", 00:17:01.726 "superblock": true, 00:17:01.726 "num_base_bdevs": 3, 00:17:01.726 "num_base_bdevs_discovered": 3, 00:17:01.726 "num_base_bdevs_operational": 3, 00:17:01.726 "base_bdevs_list": [ 00:17:01.726 { 00:17:01.726 "name": "pt1", 00:17:01.726 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:01.726 "is_configured": true, 00:17:01.726 "data_offset": 2048, 00:17:01.726 "data_size": 63488 00:17:01.726 }, 00:17:01.726 { 00:17:01.726 "name": "pt2", 00:17:01.726 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:01.726 "is_configured": true, 00:17:01.726 "data_offset": 2048, 00:17:01.726 "data_size": 63488 00:17:01.726 }, 00:17:01.726 { 00:17:01.726 "name": "pt3", 00:17:01.726 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.726 "is_configured": true, 00:17:01.726 "data_offset": 2048, 00:17:01.726 "data_size": 63488 00:17:01.726 } 00:17:01.726 ] 00:17:01.726 } 00:17:01.726 } 00:17:01.726 }' 00:17:01.726 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:01.726 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:01.726 pt2 00:17:01.726 pt3' 00:17:01.726 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.726 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:01.726 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:01.984 "name": "pt1", 00:17:01.984 "aliases": [ 00:17:01.984 "00000000-0000-0000-0000-000000000001" 00:17:01.984 ], 00:17:01.984 "product_name": "passthru", 00:17:01.984 "block_size": 512, 00:17:01.984 "num_blocks": 65536, 00:17:01.984 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:01.984 "assigned_rate_limits": { 00:17:01.984 "rw_ios_per_sec": 0, 00:17:01.984 "rw_mbytes_per_sec": 0, 00:17:01.984 "r_mbytes_per_sec": 0, 00:17:01.984 "w_mbytes_per_sec": 0 00:17:01.984 }, 00:17:01.984 "claimed": true, 00:17:01.984 "claim_type": "exclusive_write", 00:17:01.984 "zoned": false, 00:17:01.984 "supported_io_types": { 00:17:01.984 "read": true, 00:17:01.984 "write": true, 00:17:01.984 "unmap": true, 00:17:01.984 "flush": true, 00:17:01.984 "reset": true, 00:17:01.984 "nvme_admin": false, 00:17:01.984 "nvme_io": false, 00:17:01.984 "nvme_io_md": false, 00:17:01.984 "write_zeroes": true, 00:17:01.984 "zcopy": true, 00:17:01.984 "get_zone_info": false, 00:17:01.984 "zone_management": false, 00:17:01.984 
"zone_append": false, 00:17:01.984 "compare": false, 00:17:01.984 "compare_and_write": false, 00:17:01.984 "abort": true, 00:17:01.984 "seek_hole": false, 00:17:01.984 "seek_data": false, 00:17:01.984 "copy": true, 00:17:01.984 "nvme_iov_md": false 00:17:01.984 }, 00:17:01.984 "memory_domains": [ 00:17:01.984 { 00:17:01.984 "dma_device_id": "system", 00:17:01.984 "dma_device_type": 1 00:17:01.984 }, 00:17:01.984 { 00:17:01.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.984 "dma_device_type": 2 00:17:01.984 } 00:17:01.984 ], 00:17:01.984 "driver_specific": { 00:17:01.984 "passthru": { 00:17:01.984 "name": "pt1", 00:17:01.984 "base_bdev_name": "malloc1" 00:17:01.984 } 00:17:01.984 } 00:17:01.984 }' 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.984 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:02.243 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.502 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.502 "name": "pt2", 00:17:02.502 "aliases": [ 00:17:02.502 "00000000-0000-0000-0000-000000000002" 00:17:02.502 ], 00:17:02.502 "product_name": "passthru", 00:17:02.502 "block_size": 512, 00:17:02.502 "num_blocks": 65536, 00:17:02.502 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.502 "assigned_rate_limits": { 00:17:02.502 "rw_ios_per_sec": 0, 00:17:02.502 "rw_mbytes_per_sec": 0, 00:17:02.502 "r_mbytes_per_sec": 0, 00:17:02.502 "w_mbytes_per_sec": 0 00:17:02.502 }, 00:17:02.502 "claimed": true, 00:17:02.502 "claim_type": "exclusive_write", 00:17:02.502 "zoned": false, 00:17:02.502 "supported_io_types": { 00:17:02.502 "read": true, 00:17:02.502 "write": true, 00:17:02.502 "unmap": true, 00:17:02.502 "flush": true, 00:17:02.502 "reset": true, 00:17:02.502 "nvme_admin": false, 00:17:02.502 "nvme_io": false, 00:17:02.502 "nvme_io_md": false, 00:17:02.502 "write_zeroes": true, 00:17:02.502 "zcopy": true, 00:17:02.502 "get_zone_info": false, 00:17:02.502 "zone_management": false, 00:17:02.502 "zone_append": false, 00:17:02.502 "compare": false, 00:17:02.502 "compare_and_write": false, 00:17:02.502 "abort": true, 00:17:02.502 
"seek_hole": false, 00:17:02.502 "seek_data": false, 00:17:02.502 "copy": true, 00:17:02.502 "nvme_iov_md": false 00:17:02.502 }, 00:17:02.502 "memory_domains": [ 00:17:02.502 { 00:17:02.502 "dma_device_id": "system", 00:17:02.502 "dma_device_type": 1 00:17:02.502 }, 00:17:02.502 { 00:17:02.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.502 "dma_device_type": 2 00:17:02.502 } 00:17:02.502 ], 00:17:02.502 "driver_specific": { 00:17:02.502 "passthru": { 00:17:02.502 "name": "pt2", 00:17:02.502 "base_bdev_name": "malloc2" 00:17:02.502 } 00:17:02.502 } 00:17:02.502 }' 00:17:02.502 07:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.502 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.761 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.019 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.019 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.019 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.019 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:03.019 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.587 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.587 "name": "pt3", 00:17:03.587 "aliases": [ 00:17:03.587 "00000000-0000-0000-0000-000000000003" 00:17:03.587 ], 00:17:03.587 "product_name": "passthru", 00:17:03.587 "block_size": 512, 00:17:03.587 "num_blocks": 65536, 00:17:03.587 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.587 "assigned_rate_limits": { 00:17:03.587 "rw_ios_per_sec": 0, 00:17:03.587 "rw_mbytes_per_sec": 0, 00:17:03.587 "r_mbytes_per_sec": 0, 00:17:03.587 "w_mbytes_per_sec": 0 00:17:03.587 }, 00:17:03.587 "claimed": true, 00:17:03.587 "claim_type": "exclusive_write", 00:17:03.587 "zoned": false, 00:17:03.587 "supported_io_types": { 00:17:03.587 "read": true, 00:17:03.587 "write": true, 00:17:03.587 "unmap": true, 00:17:03.587 "flush": true, 00:17:03.587 "reset": true, 00:17:03.587 "nvme_admin": false, 00:17:03.587 "nvme_io": false, 00:17:03.587 "nvme_io_md": false, 00:17:03.587 "write_zeroes": true, 00:17:03.587 "zcopy": true, 00:17:03.587 "get_zone_info": false, 00:17:03.587 "zone_management": false, 00:17:03.587 "zone_append": false, 00:17:03.587 "compare": false, 00:17:03.587 "compare_and_write": false, 00:17:03.587 "abort": true, 00:17:03.587 "seek_hole": false, 00:17:03.587 "seek_data": false, 00:17:03.587 "copy": true, 00:17:03.587 "nvme_iov_md": false 00:17:03.587 }, 
00:17:03.587 "memory_domains": [ 00:17:03.587 { 00:17:03.587 "dma_device_id": "system", 00:17:03.587 "dma_device_type": 1 00:17:03.587 }, 00:17:03.587 { 00:17:03.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.587 "dma_device_type": 2 00:17:03.587 } 00:17:03.587 ], 00:17:03.587 "driver_specific": { 00:17:03.587 "passthru": { 00:17:03.587 "name": "pt3", 00:17:03.587 "base_bdev_name": "malloc3" 00:17:03.587 } 00:17:03.587 } 00:17:03.587 }' 00:17:03.587 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.587 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.587 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.587 07:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.587 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.587 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.587 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.587 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.847 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.847 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.847 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.847 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.847 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:03.847 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:17:04.106 [2024-07-25 07:22:36.490784] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 8f8682b3-2b40-4376-9120-f5bb40056ceb '!=' 8f8682b3-2b40-4376-9120-f5bb40056ceb ']' 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1638073 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1638073 ']' 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1638073 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1638073 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1638073' 00:17:04.106 
killing process with pid 1638073 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1638073 00:17:04.106 [2024-07-25 07:22:36.564120] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:04.106 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1638073 00:17:04.106 [2024-07-25 07:22:36.564174] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:04.106 [2024-07-25 07:22:36.564227] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:04.106 [2024-07-25 07:22:36.564238] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274fef0 name raid_bdev1, state offline 00:17:04.106 [2024-07-25 07:22:36.587929] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:04.368 07:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:17:04.368 00:17:04.368 real 0m13.831s 00:17:04.368 user 0m25.073s 00:17:04.368 sys 0m2.364s 00:17:04.368 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:04.368 07:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.368 ************************************ 00:17:04.368 END TEST raid_superblock_test 00:17:04.368 ************************************ 00:17:04.368 07:22:36 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:17:04.368 07:22:36 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:04.368 07:22:36 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:04.368 07:22:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:04.368 ************************************ 00:17:04.368 START TEST raid_read_error_test 00:17:04.368 ************************************ 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 read 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.TKIW2gdcMR 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1640740 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1640740 /var/tmp/spdk-raid.sock 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1640740 ']' 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:04.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.368 07:22:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:04.689 [2024-07-25 07:22:36.911163] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:17:04.689 [2024-07-25 07:22:36.911220] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640740 ] 00:17:04.689 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.689 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:04.689 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.689 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:04.689 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.689 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:04.689 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.689 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.2 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:04.690 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:04.690 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:04.690 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:04.690 [2024-07-25 07:22:37.040624] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:04.690 [2024-07-25 07:22:37.127095] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:17:04.690 [2024-07-25 07:22:37.179853] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:04.690 [2024-07-25 07:22:37.179884] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:05.265 07:22:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:05.265 07:22:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:05.265 07:22:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:05.265 07:22:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:05.524 BaseBdev1_malloc 00:17:05.525 07:22:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:05.783 true 00:17:05.783 07:22:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:05.783 [2024-07-25 07:22:38.314878] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:05.783 [2024-07-25 07:22:38.314918] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.783 [2024-07-25 07:22:38.314936] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x873a50 00:17:05.783 [2024-07-25 07:22:38.314947] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.783 [2024-07-25 07:22:38.316432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.783 [2024-07-25 07:22:38.316459] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:06.042 BaseBdev1 00:17:06.042 07:22:38 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:06.042 07:22:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:06.042 BaseBdev2_malloc 00:17:06.042 07:22:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:06.302 true 00:17:06.302 07:22:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:06.561 [2024-07-25 07:22:38.852696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:06.561 [2024-07-25 07:22:38.852735] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.561 [2024-07-25 07:22:38.852753] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa1cf40 00:17:06.561 [2024-07-25 07:22:38.852764] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.561 [2024-07-25 07:22:38.854158] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.561 [2024-07-25 07:22:38.854184] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:06.561 BaseBdev2 00:17:06.561 07:22:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:06.561 07:22:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:06.561 BaseBdev3_malloc 00:17:06.561 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:06.820 true 00:17:06.820 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:07.079 [2024-07-25 07:22:39.382247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:07.079 [2024-07-25 07:22:39.382284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:07.079 [2024-07-25 07:22:39.382306] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa20250 00:17:07.079 [2024-07-25 07:22:39.382318] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:07.079 [2024-07-25 07:22:39.383700] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:07.079 [2024-07-25 07:22:39.383726] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:07.079 BaseBdev3 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:07.079 [2024-07-25 07:22:39.594833] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:07.079 [2024-07-25 07:22:39.595999] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:07.079 [2024-07-25 07:22:39.596063] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:07.079 [2024-07-25 07:22:39.596266] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa21010 00:17:07.079 [2024-07-25 07:22:39.596277] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:07.079 [2024-07-25 07:22:39.596452] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x86fcb0 00:17:07.079 [2024-07-25 07:22:39.596586] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa21010 00:17:07.079 [2024-07-25 07:22:39.596595] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa21010 00:17:07.079 [2024-07-25 07:22:39.596687] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.079 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.338 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.338 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:07.338 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.338 "name": "raid_bdev1", 00:17:07.338 "uuid": "d88787a0-445b-4a8a-808e-39bb1578a5c1", 00:17:07.338 "strip_size_kb": 64, 00:17:07.338 "state": "online", 00:17:07.338 "raid_level": "concat", 00:17:07.338 "superblock": true, 00:17:07.338 "num_base_bdevs": 3, 00:17:07.338 "num_base_bdevs_discovered": 3, 00:17:07.338 "num_base_bdevs_operational": 3, 00:17:07.338 "base_bdevs_list": [ 00:17:07.338 { 00:17:07.338 "name": "BaseBdev1", 00:17:07.338 "uuid": "1cb25eee-11e2-537d-9ec3-c9959b786704", 00:17:07.338 "is_configured": true, 00:17:07.338 "data_offset": 2048, 00:17:07.338 "data_size": 63488 00:17:07.338 }, 00:17:07.338 { 00:17:07.338 "name": "BaseBdev2", 00:17:07.338 "uuid": "61100e3b-b6da-58e3-847c-91fe9d19a169", 00:17:07.338 "is_configured": true, 00:17:07.338 "data_offset": 2048, 00:17:07.338 "data_size": 63488 00:17:07.338 }, 00:17:07.338 { 00:17:07.338 "name": "BaseBdev3", 00:17:07.338 "uuid": "0c58c242-702e-52a5-b353-820c7ced9652", 00:17:07.338 "is_configured": true, 00:17:07.338 "data_offset": 2048, 00:17:07.338 "data_size": 63488 
00:17:07.338 } 00:17:07.338 ] 00:17:07.338 }' 00:17:07.338 07:22:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.338 07:22:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.907 07:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:17:07.907 07:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:08.166 [2024-07-25 07:22:40.501463] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x86f430 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.104 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:09.363 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.363 "name": "raid_bdev1", 00:17:09.363 "uuid": "d88787a0-445b-4a8a-808e-39bb1578a5c1", 00:17:09.363 "strip_size_kb": 64, 00:17:09.363 "state": "online", 00:17:09.363 "raid_level": "concat", 00:17:09.363 "superblock": true, 00:17:09.363 "num_base_bdevs": 3, 00:17:09.363 "num_base_bdevs_discovered": 3, 00:17:09.363 "num_base_bdevs_operational": 3, 00:17:09.363 "base_bdevs_list": [ 00:17:09.363 { 00:17:09.363 "name": "BaseBdev1", 00:17:09.363 "uuid": "1cb25eee-11e2-537d-9ec3-c9959b786704", 00:17:09.363 "is_configured": true, 00:17:09.363 "data_offset": 2048, 00:17:09.363 "data_size": 63488 00:17:09.363 }, 00:17:09.363 { 00:17:09.363 "name": "BaseBdev2", 00:17:09.363 "uuid": "61100e3b-b6da-58e3-847c-91fe9d19a169", 00:17:09.363 "is_configured": true, 00:17:09.363 "data_offset": 2048, 
00:17:09.363 "data_size": 63488 00:17:09.363 }, 00:17:09.363 { 00:17:09.363 "name": "BaseBdev3", 00:17:09.363 "uuid": "0c58c242-702e-52a5-b353-820c7ced9652", 00:17:09.363 "is_configured": true, 00:17:09.363 "data_offset": 2048, 00:17:09.363 "data_size": 63488 00:17:09.363 } 00:17:09.363 ] 00:17:09.363 }' 00:17:09.363 07:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.363 07:22:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.931 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:10.190 [2024-07-25 07:22:42.511398] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:10.190 [2024-07-25 07:22:42.511433] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:10.190 [2024-07-25 07:22:42.514349] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:10.190 [2024-07-25 07:22:42.514381] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:10.190 [2024-07-25 07:22:42.514412] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:10.190 [2024-07-25 07:22:42.514422] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa21010 name raid_bdev1, state offline 00:17:10.190 0 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1640740 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1640740 ']' 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1640740 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1640740 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1640740' 00:17:10.190 killing process with pid 1640740 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1640740 00:17:10.190 [2024-07-25 07:22:42.588343] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:10.190 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1640740 00:17:10.190 [2024-07-25 07:22:42.606907] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.TKIW2gdcMR 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@213 -- # case $1 in 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:17:10.451 00:17:10.451 real 0m5.974s 00:17:10.451 user 0m9.299s 00:17:10.451 sys 0m1.047s 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:10.451 07:22:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.451 ************************************ 00:17:10.451 END TEST raid_read_error_test 00:17:10.451 ************************************ 00:17:10.451 07:22:42 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:17:10.451 07:22:42 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:10.451 07:22:42 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:10.451 07:22:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:10.451 ************************************ 00:17:10.451 START TEST raid_write_error_test 00:17:10.451 ************************************ 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 write 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:17:10.451 07:22:42 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.zBI6EDUSbK 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1641817 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1641817 /var/tmp/spdk-raid.sock 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1641817 ']' 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:10.451 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:10.451 07:22:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.451 [2024-07-25 07:22:42.970162] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
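The write-error variant set up above drives the bdevperf instance just launched on /var/tmp/spdk-raid.sock and, as the trace below shows, builds each base bdev as a malloc -> error -> passthru stack before assembling the concat array and injecting write failures on one leg. A condensed shell sketch of that RPC sequence, using only commands, bdev names and sizes that appear in the trace (the loop, the RPC variable and the backgrounding are illustrative shorthand for the flow, not the test script itself):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for i in 1 2 3; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc               # 32 MB malloc bdev, 512-byte blocks
    $RPC bdev_error_create BaseBdev${i}_malloc                          # error-injection bdev (EE_BaseBdev<i>_malloc in the trace)
    $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i} # passthru on top, claimed by the raid below
  done
  $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
      -s /var/tmp/spdk-raid.sock perform_tests &                        # start randrw I/O in the already-running bdevperf
  $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure        # fail writes on the first leg
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
  $RPC bdev_raid_delete raid_bdev1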
00:17:10.451 [2024-07-25 07:22:42.970208] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1641817 ] 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:02.2 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.711 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:10.711 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:10.712 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:10.712 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:10.712 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:10.712 [2024-07-25 07:22:43.089151] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:10.712 [2024-07-25 07:22:43.171756] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:17:10.712 [2024-07-25 07:22:43.231622] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.712 [2024-07-25 07:22:43.231659] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:11.652 07:22:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:11.652 07:22:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:11.652 07:22:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:11.652 07:22:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:11.652 BaseBdev1_malloc 00:17:11.652 07:22:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:11.910 true 00:17:11.910 07:22:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:12.169 [2024-07-25 07:22:44.552729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:12.169 [2024-07-25 07:22:44.552770] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.169 [2024-07-25 07:22:44.552787] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac8a50 00:17:12.169 [2024-07-25 07:22:44.552802] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.169 [2024-07-25 07:22:44.554189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.169 [2024-07-25 07:22:44.554216] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:12.169 BaseBdev1 00:17:12.169 07:22:44 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:12.169 07:22:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:12.428 BaseBdev2_malloc 00:17:12.428 07:22:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:12.687 true 00:17:12.687 07:22:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:12.946 [2024-07-25 07:22:45.238764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:12.946 [2024-07-25 07:22:45.238802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.946 [2024-07-25 07:22:45.238819] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c71f40 00:17:12.946 [2024-07-25 07:22:45.238830] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.946 [2024-07-25 07:22:45.240090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.946 [2024-07-25 07:22:45.240116] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:12.946 BaseBdev2 00:17:12.946 07:22:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:12.946 07:22:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:12.946 BaseBdev3_malloc 00:17:13.204 07:22:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:13.204 true 00:17:13.204 07:22:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:13.463 [2024-07-25 07:22:45.916810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:13.463 [2024-07-25 07:22:45.916846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:13.463 [2024-07-25 07:22:45.916863] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c75250 00:17:13.463 [2024-07-25 07:22:45.916873] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.463 [2024-07-25 07:22:45.918173] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.463 [2024-07-25 07:22:45.918198] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:13.463 BaseBdev3 00:17:13.463 07:22:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:13.722 [2024-07-25 07:22:46.153531] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:13.722 [2024-07-25 07:22:46.154602] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:13.722 [2024-07-25 07:22:46.154664] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:13.722 [2024-07-25 07:22:46.154854] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c76010 00:17:13.722 [2024-07-25 07:22:46.154865] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:13.722 [2024-07-25 07:22:46.155024] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac4cb0 00:17:13.722 [2024-07-25 07:22:46.155168] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c76010 00:17:13.722 [2024-07-25 07:22:46.155178] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c76010 00:17:13.722 [2024-07-25 07:22:46.155266] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.722 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:13.983 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.983 "name": "raid_bdev1", 00:17:13.983 "uuid": "ac463166-d01c-4782-bd59-c124ca95f494", 00:17:13.983 "strip_size_kb": 64, 00:17:13.983 "state": "online", 00:17:13.983 "raid_level": "concat", 00:17:13.983 "superblock": true, 00:17:13.983 "num_base_bdevs": 3, 00:17:13.983 "num_base_bdevs_discovered": 3, 00:17:13.983 "num_base_bdevs_operational": 3, 00:17:13.983 "base_bdevs_list": [ 00:17:13.983 { 00:17:13.983 "name": "BaseBdev1", 00:17:13.983 "uuid": "c83056dd-b1f3-5e56-801c-c7b4ddf4f218", 00:17:13.983 "is_configured": true, 00:17:13.983 "data_offset": 2048, 00:17:13.983 "data_size": 63488 00:17:13.983 }, 00:17:13.983 { 00:17:13.983 "name": "BaseBdev2", 00:17:13.983 "uuid": "7e81c4a2-2ca9-574e-9f17-ea5b5fa08cc8", 00:17:13.983 "is_configured": true, 00:17:13.983 "data_offset": 2048, 00:17:13.983 "data_size": 63488 00:17:13.983 }, 00:17:13.983 { 00:17:13.983 "name": "BaseBdev3", 00:17:13.983 "uuid": "f54d6919-180e-5b7e-bef0-59902e9d0a53", 00:17:13.983 "is_configured": true, 00:17:13.983 "data_offset": 2048, 00:17:13.983 
"data_size": 63488 00:17:13.983 } 00:17:13.983 ] 00:17:13.983 }' 00:17:13.983 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.983 07:22:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:14.549 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:17:14.549 07:22:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:14.550 [2024-07-25 07:22:47.036092] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac4430 00:17:15.483 07:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.742 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:16.000 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.000 "name": "raid_bdev1", 00:17:16.000 "uuid": "ac463166-d01c-4782-bd59-c124ca95f494", 00:17:16.000 "strip_size_kb": 64, 00:17:16.000 "state": "online", 00:17:16.000 "raid_level": "concat", 00:17:16.000 "superblock": true, 00:17:16.000 "num_base_bdevs": 3, 00:17:16.000 "num_base_bdevs_discovered": 3, 00:17:16.000 "num_base_bdevs_operational": 3, 00:17:16.000 "base_bdevs_list": [ 00:17:16.000 { 00:17:16.000 "name": "BaseBdev1", 00:17:16.000 "uuid": "c83056dd-b1f3-5e56-801c-c7b4ddf4f218", 00:17:16.000 "is_configured": true, 00:17:16.000 "data_offset": 2048, 00:17:16.000 "data_size": 63488 00:17:16.000 }, 00:17:16.000 { 00:17:16.000 "name": "BaseBdev2", 00:17:16.000 "uuid": "7e81c4a2-2ca9-574e-9f17-ea5b5fa08cc8", 00:17:16.000 "is_configured": 
true, 00:17:16.000 "data_offset": 2048, 00:17:16.000 "data_size": 63488 00:17:16.000 }, 00:17:16.000 { 00:17:16.000 "name": "BaseBdev3", 00:17:16.000 "uuid": "f54d6919-180e-5b7e-bef0-59902e9d0a53", 00:17:16.000 "is_configured": true, 00:17:16.000 "data_offset": 2048, 00:17:16.000 "data_size": 63488 00:17:16.000 } 00:17:16.000 ] 00:17:16.000 }' 00:17:16.000 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.000 07:22:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.569 07:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:16.829 [2024-07-25 07:22:49.178525] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:16.829 [2024-07-25 07:22:49.178559] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:16.829 [2024-07-25 07:22:49.181496] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:16.829 [2024-07-25 07:22:49.181531] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:16.829 [2024-07-25 07:22:49.181561] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:16.829 [2024-07-25 07:22:49.181571] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c76010 name raid_bdev1, state offline 00:17:16.829 0 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1641817 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1641817 ']' 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1641817 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1641817 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1641817' 00:17:16.829 killing process with pid 1641817 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1641817 00:17:16.829 [2024-07-25 07:22:49.257220] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:16.829 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1641817 00:17:16.829 [2024-07-25 07:22:49.275531] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.zBI6EDUSbK 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy 
concat 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:17:17.089 00:17:17.089 real 0m6.581s 00:17:17.089 user 0m10.385s 00:17:17.089 sys 0m1.133s 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:17.089 07:22:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.089 ************************************ 00:17:17.089 END TEST raid_write_error_test 00:17:17.089 ************************************ 00:17:17.089 07:22:49 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:17:17.089 07:22:49 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:17:17.089 07:22:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:17.089 07:22:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:17.089 07:22:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:17.089 ************************************ 00:17:17.089 START TEST raid_state_function_test 00:17:17.089 ************************************ 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 false 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 
00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1643056 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1643056' 00:17:17.089 Process raid pid: 1643056 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1643056 /var/tmp/spdk-raid.sock 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1643056 ']' 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:17.089 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:17.089 07:22:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.349 [2024-07-25 07:22:49.638716] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
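The state-function test launched above runs against the lightweight bdev_svc app instead of bdevperf and, as the trace below shows, repeatedly checks the state reported by bdev_raid_get_bdevs: a raid1 bdev created before its base bdevs exist stays in the "configuring" state with zero discovered base bdevs, and the discovered count rises as malloc base bdevs are added. A minimal sketch of one such check over the same socket (the trace's jq filter selects on the bdev name only; the field extraction here is illustrative, and the actual test deletes and recreates Existed_Raid between steps):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # create the raid1 bdev first; BaseBdev1..3 do not exist yet, so it cannot go online
  $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
  # -> "configuring"
  # add one base bdev and re-check the discovery count
  $RPC bdev_malloc_create 32 512 -b BaseBdev1
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .num_base_bdevs_discovered'
  # -> 1, while num_base_bdevs_operational stays 3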
00:17:17.349 [2024-07-25 07:22:49.638777] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.2 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:17.349 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:17.349 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.349 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:17.350 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.350 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:17.350 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.350 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:17.350 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.350 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:17.350 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.350 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:17.350 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.350 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:17.350 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.350 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:17.350 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:17.350 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:17.350 [2024-07-25 07:22:49.774386] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:17.350 [2024-07-25 07:22:49.856495] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:17:17.609 [2024-07-25 07:22:49.917327] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:17.609 [2024-07-25 07:22:49.917361] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:18.255 [2024-07-25 07:22:50.744497] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:18.255 [2024-07-25 07:22:50.744540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:18.255 [2024-07-25 07:22:50.744551] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:18.255 [2024-07-25 07:22:50.744561] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:18.255 [2024-07-25 07:22:50.744569] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:18.255 [2024-07-25 07:22:50.744580] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.255 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.514 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.514 "name": "Existed_Raid", 00:17:18.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.514 "strip_size_kb": 0, 00:17:18.514 "state": "configuring", 00:17:18.514 "raid_level": "raid1", 00:17:18.514 "superblock": false, 00:17:18.514 "num_base_bdevs": 3, 00:17:18.514 "num_base_bdevs_discovered": 0, 00:17:18.514 "num_base_bdevs_operational": 3, 00:17:18.514 "base_bdevs_list": [ 00:17:18.514 { 00:17:18.514 "name": "BaseBdev1", 00:17:18.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.514 "is_configured": false, 00:17:18.514 "data_offset": 0, 00:17:18.514 "data_size": 0 00:17:18.514 }, 00:17:18.514 { 00:17:18.514 "name": "BaseBdev2", 00:17:18.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.514 "is_configured": false, 00:17:18.514 "data_offset": 0, 00:17:18.514 "data_size": 0 00:17:18.514 }, 00:17:18.515 { 00:17:18.515 "name": "BaseBdev3", 00:17:18.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.515 "is_configured": false, 00:17:18.515 "data_offset": 0, 00:17:18.515 "data_size": 0 00:17:18.515 } 00:17:18.515 ] 00:17:18.515 }' 00:17:18.515 07:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.515 07:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.082 07:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:19.341 [2024-07-25 07:22:51.783105] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:19.341 [2024-07-25 07:22:51.783132] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f2ec0 name Existed_Raid, state configuring 00:17:19.341 07:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:19.600 [2024-07-25 07:22:52.011709] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:19.600 [2024-07-25 07:22:52.011736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:19.600 [2024-07-25 07:22:52.011745] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:19.600 [2024-07-25 07:22:52.011756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 
00:17:19.600 [2024-07-25 07:22:52.011764] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:19.600 [2024-07-25 07:22:52.011774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:19.600 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:19.859 [2024-07-25 07:22:52.245866] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:19.859 BaseBdev1 00:17:19.859 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:19.859 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:19.859 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:19.859 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:19.859 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:19.859 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:19.859 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.118 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:20.376 [ 00:17:20.376 { 00:17:20.376 "name": "BaseBdev1", 00:17:20.376 "aliases": [ 00:17:20.376 "6dec36bf-5103-4cfe-891a-e08e8dc4cf08" 00:17:20.376 ], 00:17:20.376 "product_name": "Malloc disk", 00:17:20.376 "block_size": 512, 00:17:20.376 "num_blocks": 65536, 00:17:20.376 "uuid": "6dec36bf-5103-4cfe-891a-e08e8dc4cf08", 00:17:20.376 "assigned_rate_limits": { 00:17:20.376 "rw_ios_per_sec": 0, 00:17:20.376 "rw_mbytes_per_sec": 0, 00:17:20.376 "r_mbytes_per_sec": 0, 00:17:20.376 "w_mbytes_per_sec": 0 00:17:20.376 }, 00:17:20.376 "claimed": true, 00:17:20.376 "claim_type": "exclusive_write", 00:17:20.376 "zoned": false, 00:17:20.376 "supported_io_types": { 00:17:20.376 "read": true, 00:17:20.376 "write": true, 00:17:20.376 "unmap": true, 00:17:20.376 "flush": true, 00:17:20.376 "reset": true, 00:17:20.376 "nvme_admin": false, 00:17:20.376 "nvme_io": false, 00:17:20.376 "nvme_io_md": false, 00:17:20.376 "write_zeroes": true, 00:17:20.376 "zcopy": true, 00:17:20.376 "get_zone_info": false, 00:17:20.376 "zone_management": false, 00:17:20.376 "zone_append": false, 00:17:20.376 "compare": false, 00:17:20.376 "compare_and_write": false, 00:17:20.376 "abort": true, 00:17:20.376 "seek_hole": false, 00:17:20.376 "seek_data": false, 00:17:20.376 "copy": true, 00:17:20.376 "nvme_iov_md": false 00:17:20.376 }, 00:17:20.376 "memory_domains": [ 00:17:20.376 { 00:17:20.376 "dma_device_id": "system", 00:17:20.376 "dma_device_type": 1 00:17:20.376 }, 00:17:20.376 { 00:17:20.376 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.376 "dma_device_type": 2 00:17:20.376 } 00:17:20.376 ], 00:17:20.376 "driver_specific": {} 00:17:20.376 } 00:17:20.376 ] 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.376 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.634 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.634 "name": "Existed_Raid", 00:17:20.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.634 "strip_size_kb": 0, 00:17:20.634 "state": "configuring", 00:17:20.634 "raid_level": "raid1", 00:17:20.634 "superblock": false, 00:17:20.634 "num_base_bdevs": 3, 00:17:20.634 "num_base_bdevs_discovered": 1, 00:17:20.634 "num_base_bdevs_operational": 3, 00:17:20.635 "base_bdevs_list": [ 00:17:20.635 { 00:17:20.635 "name": "BaseBdev1", 00:17:20.635 "uuid": "6dec36bf-5103-4cfe-891a-e08e8dc4cf08", 00:17:20.635 "is_configured": true, 00:17:20.635 "data_offset": 0, 00:17:20.635 "data_size": 65536 00:17:20.635 }, 00:17:20.635 { 00:17:20.635 "name": "BaseBdev2", 00:17:20.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.635 "is_configured": false, 00:17:20.635 "data_offset": 0, 00:17:20.635 "data_size": 0 00:17:20.635 }, 00:17:20.635 { 00:17:20.635 "name": "BaseBdev3", 00:17:20.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.635 "is_configured": false, 00:17:20.635 "data_offset": 0, 00:17:20.635 "data_size": 0 00:17:20.635 } 00:17:20.635 ] 00:17:20.635 }' 00:17:20.635 07:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.635 07:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.201 07:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:21.460 [2024-07-25 07:22:53.781905] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:21.460 [2024-07-25 07:22:53.781943] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f2790 name Existed_Raid, state configuring 00:17:21.460 07:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n 
Existed_Raid 00:17:21.719 [2024-07-25 07:22:54.010536] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:21.719 [2024-07-25 07:22:54.011906] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:21.720 [2024-07-25 07:22:54.011938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:21.720 [2024-07-25 07:22:54.011948] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:21.720 [2024-07-25 07:22:54.011958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.720 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.979 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.979 "name": "Existed_Raid", 00:17:21.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.979 "strip_size_kb": 0, 00:17:21.979 "state": "configuring", 00:17:21.979 "raid_level": "raid1", 00:17:21.979 "superblock": false, 00:17:21.979 "num_base_bdevs": 3, 00:17:21.979 "num_base_bdevs_discovered": 1, 00:17:21.979 "num_base_bdevs_operational": 3, 00:17:21.979 "base_bdevs_list": [ 00:17:21.979 { 00:17:21.979 "name": "BaseBdev1", 00:17:21.979 "uuid": "6dec36bf-5103-4cfe-891a-e08e8dc4cf08", 00:17:21.979 "is_configured": true, 00:17:21.979 "data_offset": 0, 00:17:21.979 "data_size": 65536 00:17:21.979 }, 00:17:21.979 { 00:17:21.979 "name": "BaseBdev2", 00:17:21.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.979 "is_configured": false, 00:17:21.979 "data_offset": 0, 00:17:21.979 "data_size": 0 00:17:21.979 }, 00:17:21.979 { 00:17:21.979 "name": "BaseBdev3", 00:17:21.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.979 "is_configured": false, 00:17:21.979 "data_offset": 0, 00:17:21.979 "data_size": 0 00:17:21.979 } 00:17:21.979 ] 
00:17:21.979 }' 00:17:21.979 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.979 07:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.546 07:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:22.546 [2024-07-25 07:22:55.028388] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:22.546 BaseBdev2 00:17:22.546 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:22.546 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:22.546 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:22.547 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:22.547 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:22.547 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:22.547 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.804 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:23.063 [ 00:17:23.063 { 00:17:23.063 "name": "BaseBdev2", 00:17:23.063 "aliases": [ 00:17:23.063 "60a48ea0-703a-4b35-9bc5-69044e1448ed" 00:17:23.063 ], 00:17:23.063 "product_name": "Malloc disk", 00:17:23.063 "block_size": 512, 00:17:23.063 "num_blocks": 65536, 00:17:23.063 "uuid": "60a48ea0-703a-4b35-9bc5-69044e1448ed", 00:17:23.063 "assigned_rate_limits": { 00:17:23.063 "rw_ios_per_sec": 0, 00:17:23.063 "rw_mbytes_per_sec": 0, 00:17:23.063 "r_mbytes_per_sec": 0, 00:17:23.063 "w_mbytes_per_sec": 0 00:17:23.063 }, 00:17:23.063 "claimed": true, 00:17:23.063 "claim_type": "exclusive_write", 00:17:23.063 "zoned": false, 00:17:23.063 "supported_io_types": { 00:17:23.063 "read": true, 00:17:23.063 "write": true, 00:17:23.063 "unmap": true, 00:17:23.063 "flush": true, 00:17:23.063 "reset": true, 00:17:23.063 "nvme_admin": false, 00:17:23.063 "nvme_io": false, 00:17:23.063 "nvme_io_md": false, 00:17:23.063 "write_zeroes": true, 00:17:23.063 "zcopy": true, 00:17:23.063 "get_zone_info": false, 00:17:23.063 "zone_management": false, 00:17:23.063 "zone_append": false, 00:17:23.063 "compare": false, 00:17:23.063 "compare_and_write": false, 00:17:23.063 "abort": true, 00:17:23.063 "seek_hole": false, 00:17:23.063 "seek_data": false, 00:17:23.063 "copy": true, 00:17:23.063 "nvme_iov_md": false 00:17:23.063 }, 00:17:23.063 "memory_domains": [ 00:17:23.063 { 00:17:23.063 "dma_device_id": "system", 00:17:23.063 "dma_device_type": 1 00:17:23.063 }, 00:17:23.063 { 00:17:23.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.063 "dma_device_type": 2 00:17:23.063 } 00:17:23.063 ], 00:17:23.063 "driver_specific": {} 00:17:23.063 } 00:17:23.063 ] 00:17:23.063 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:23.063 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:23.063 
07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:23.063 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:23.063 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.064 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.322 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.322 "name": "Existed_Raid", 00:17:23.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.322 "strip_size_kb": 0, 00:17:23.322 "state": "configuring", 00:17:23.322 "raid_level": "raid1", 00:17:23.322 "superblock": false, 00:17:23.322 "num_base_bdevs": 3, 00:17:23.322 "num_base_bdevs_discovered": 2, 00:17:23.322 "num_base_bdevs_operational": 3, 00:17:23.322 "base_bdevs_list": [ 00:17:23.322 { 00:17:23.322 "name": "BaseBdev1", 00:17:23.322 "uuid": "6dec36bf-5103-4cfe-891a-e08e8dc4cf08", 00:17:23.322 "is_configured": true, 00:17:23.322 "data_offset": 0, 00:17:23.322 "data_size": 65536 00:17:23.322 }, 00:17:23.322 { 00:17:23.322 "name": "BaseBdev2", 00:17:23.322 "uuid": "60a48ea0-703a-4b35-9bc5-69044e1448ed", 00:17:23.322 "is_configured": true, 00:17:23.322 "data_offset": 0, 00:17:23.322 "data_size": 65536 00:17:23.322 }, 00:17:23.322 { 00:17:23.322 "name": "BaseBdev3", 00:17:23.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.322 "is_configured": false, 00:17:23.322 "data_offset": 0, 00:17:23.322 "data_size": 0 00:17:23.322 } 00:17:23.322 ] 00:17:23.322 }' 00:17:23.322 07:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.322 07:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.889 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:24.148 [2024-07-25 07:22:56.503466] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:24.148 [2024-07-25 07:22:56.503505] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x8f3680 00:17:24.148 [2024-07-25 07:22:56.503512] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: 
blockcnt 65536, blocklen 512 00:17:24.148 [2024-07-25 07:22:56.503690] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8f50b0 00:17:24.148 [2024-07-25 07:22:56.503814] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8f3680 00:17:24.148 [2024-07-25 07:22:56.503823] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8f3680 00:17:24.148 [2024-07-25 07:22:56.503971] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:24.148 BaseBdev3 00:17:24.148 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:24.148 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:24.148 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:24.148 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:24.148 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:24.148 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:24.148 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:24.406 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:24.665 [ 00:17:24.665 { 00:17:24.665 "name": "BaseBdev3", 00:17:24.665 "aliases": [ 00:17:24.665 "6b9b8245-abd9-49e8-a38a-1df85d97d965" 00:17:24.665 ], 00:17:24.665 "product_name": "Malloc disk", 00:17:24.665 "block_size": 512, 00:17:24.665 "num_blocks": 65536, 00:17:24.665 "uuid": "6b9b8245-abd9-49e8-a38a-1df85d97d965", 00:17:24.665 "assigned_rate_limits": { 00:17:24.665 "rw_ios_per_sec": 0, 00:17:24.665 "rw_mbytes_per_sec": 0, 00:17:24.665 "r_mbytes_per_sec": 0, 00:17:24.665 "w_mbytes_per_sec": 0 00:17:24.665 }, 00:17:24.665 "claimed": true, 00:17:24.665 "claim_type": "exclusive_write", 00:17:24.665 "zoned": false, 00:17:24.665 "supported_io_types": { 00:17:24.665 "read": true, 00:17:24.665 "write": true, 00:17:24.665 "unmap": true, 00:17:24.665 "flush": true, 00:17:24.665 "reset": true, 00:17:24.665 "nvme_admin": false, 00:17:24.665 "nvme_io": false, 00:17:24.665 "nvme_io_md": false, 00:17:24.665 "write_zeroes": true, 00:17:24.665 "zcopy": true, 00:17:24.665 "get_zone_info": false, 00:17:24.665 "zone_management": false, 00:17:24.665 "zone_append": false, 00:17:24.665 "compare": false, 00:17:24.665 "compare_and_write": false, 00:17:24.665 "abort": true, 00:17:24.665 "seek_hole": false, 00:17:24.665 "seek_data": false, 00:17:24.665 "copy": true, 00:17:24.665 "nvme_iov_md": false 00:17:24.665 }, 00:17:24.665 "memory_domains": [ 00:17:24.665 { 00:17:24.665 "dma_device_id": "system", 00:17:24.665 "dma_device_type": 1 00:17:24.665 }, 00:17:24.665 { 00:17:24.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.665 "dma_device_type": 2 00:17:24.665 } 00:17:24.665 ], 00:17:24.665 "driver_specific": {} 00:17:24.665 } 00:17:24.665 ] 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:24.665 07:22:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.665 07:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.665 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.665 "name": "Existed_Raid", 00:17:24.665 "uuid": "4510765b-0b3a-432f-bb84-8c9e59eeb3e6", 00:17:24.665 "strip_size_kb": 0, 00:17:24.665 "state": "online", 00:17:24.665 "raid_level": "raid1", 00:17:24.665 "superblock": false, 00:17:24.665 "num_base_bdevs": 3, 00:17:24.665 "num_base_bdevs_discovered": 3, 00:17:24.665 "num_base_bdevs_operational": 3, 00:17:24.665 "base_bdevs_list": [ 00:17:24.665 { 00:17:24.665 "name": "BaseBdev1", 00:17:24.665 "uuid": "6dec36bf-5103-4cfe-891a-e08e8dc4cf08", 00:17:24.665 "is_configured": true, 00:17:24.665 "data_offset": 0, 00:17:24.665 "data_size": 65536 00:17:24.665 }, 00:17:24.665 { 00:17:24.665 "name": "BaseBdev2", 00:17:24.665 "uuid": "60a48ea0-703a-4b35-9bc5-69044e1448ed", 00:17:24.665 "is_configured": true, 00:17:24.665 "data_offset": 0, 00:17:24.665 "data_size": 65536 00:17:24.665 }, 00:17:24.665 { 00:17:24.665 "name": "BaseBdev3", 00:17:24.665 "uuid": "6b9b8245-abd9-49e8-a38a-1df85d97d965", 00:17:24.665 "is_configured": true, 00:17:24.665 "data_offset": 0, 00:17:24.665 "data_size": 65536 00:17:24.665 } 00:17:24.665 ] 00:17:24.665 }' 00:17:24.665 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.665 07:22:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
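The verify_raid_bdev_properties pass entered here inspects the assembled volume as an ordinary bdev and then walks its configured base bdevs. A sketch of those checks, again using the hypothetical $RPC shorthand; the RPC names, jq filters and expected values are the ones visible in the trace:

# dump the raid volume and collect the names of its configured base bdevs
raid_bdev_info=$($RPC bdev_get_bdevs -b Existed_Raid | jq '.[]')
base_bdev_names=$(echo "$raid_bdev_info" | \
    jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

# each base bdev must match the raid volume's block size (512 here)
# and carry no metadata/DIF configuration
for name in $base_bdev_names; do
    base_bdev_info=$($RPC bdev_get_bdevs -b "$name" | jq '.[]')
    echo "$base_bdev_info" | jq .block_size     # -> 512
    echo "$base_bdev_info" | jq .md_size        # -> null
    echo "$base_bdev_info" | jq .md_interleave  # -> null
    echo "$base_bdev_info" | jq .dif_type       # -> null
done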
00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:25.233 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:25.492 [2024-07-25 07:22:57.915670] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:25.492 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:25.492 "name": "Existed_Raid", 00:17:25.492 "aliases": [ 00:17:25.492 "4510765b-0b3a-432f-bb84-8c9e59eeb3e6" 00:17:25.492 ], 00:17:25.492 "product_name": "Raid Volume", 00:17:25.492 "block_size": 512, 00:17:25.492 "num_blocks": 65536, 00:17:25.492 "uuid": "4510765b-0b3a-432f-bb84-8c9e59eeb3e6", 00:17:25.492 "assigned_rate_limits": { 00:17:25.492 "rw_ios_per_sec": 0, 00:17:25.492 "rw_mbytes_per_sec": 0, 00:17:25.492 "r_mbytes_per_sec": 0, 00:17:25.492 "w_mbytes_per_sec": 0 00:17:25.492 }, 00:17:25.492 "claimed": false, 00:17:25.492 "zoned": false, 00:17:25.492 "supported_io_types": { 00:17:25.492 "read": true, 00:17:25.492 "write": true, 00:17:25.492 "unmap": false, 00:17:25.492 "flush": false, 00:17:25.492 "reset": true, 00:17:25.492 "nvme_admin": false, 00:17:25.492 "nvme_io": false, 00:17:25.492 "nvme_io_md": false, 00:17:25.492 "write_zeroes": true, 00:17:25.492 "zcopy": false, 00:17:25.492 "get_zone_info": false, 00:17:25.492 "zone_management": false, 00:17:25.492 "zone_append": false, 00:17:25.492 "compare": false, 00:17:25.492 "compare_and_write": false, 00:17:25.492 "abort": false, 00:17:25.492 "seek_hole": false, 00:17:25.492 "seek_data": false, 00:17:25.492 "copy": false, 00:17:25.492 "nvme_iov_md": false 00:17:25.492 }, 00:17:25.492 "memory_domains": [ 00:17:25.492 { 00:17:25.492 "dma_device_id": "system", 00:17:25.492 "dma_device_type": 1 00:17:25.492 }, 00:17:25.492 { 00:17:25.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.492 "dma_device_type": 2 00:17:25.492 }, 00:17:25.492 { 00:17:25.492 "dma_device_id": "system", 00:17:25.492 "dma_device_type": 1 00:17:25.492 }, 00:17:25.492 { 00:17:25.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.492 "dma_device_type": 2 00:17:25.492 }, 00:17:25.492 { 00:17:25.492 "dma_device_id": "system", 00:17:25.492 "dma_device_type": 1 00:17:25.492 }, 00:17:25.492 { 00:17:25.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.492 "dma_device_type": 2 00:17:25.492 } 00:17:25.492 ], 00:17:25.492 "driver_specific": { 00:17:25.492 "raid": { 00:17:25.492 "uuid": "4510765b-0b3a-432f-bb84-8c9e59eeb3e6", 00:17:25.492 "strip_size_kb": 0, 00:17:25.492 "state": "online", 00:17:25.492 "raid_level": "raid1", 00:17:25.492 "superblock": false, 00:17:25.492 "num_base_bdevs": 3, 00:17:25.492 "num_base_bdevs_discovered": 3, 00:17:25.492 "num_base_bdevs_operational": 3, 00:17:25.492 "base_bdevs_list": [ 00:17:25.492 { 00:17:25.492 "name": "BaseBdev1", 00:17:25.492 "uuid": "6dec36bf-5103-4cfe-891a-e08e8dc4cf08", 00:17:25.492 "is_configured": true, 00:17:25.492 "data_offset": 0, 00:17:25.492 "data_size": 65536 00:17:25.492 }, 00:17:25.492 { 00:17:25.492 "name": "BaseBdev2", 00:17:25.492 "uuid": "60a48ea0-703a-4b35-9bc5-69044e1448ed", 00:17:25.492 "is_configured": true, 00:17:25.492 "data_offset": 0, 00:17:25.492 "data_size": 65536 00:17:25.492 }, 00:17:25.492 { 00:17:25.492 "name": "BaseBdev3", 00:17:25.492 "uuid": 
"6b9b8245-abd9-49e8-a38a-1df85d97d965", 00:17:25.492 "is_configured": true, 00:17:25.492 "data_offset": 0, 00:17:25.492 "data_size": 65536 00:17:25.492 } 00:17:25.492 ] 00:17:25.492 } 00:17:25.492 } 00:17:25.492 }' 00:17:25.492 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:25.492 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:25.492 BaseBdev2 00:17:25.492 BaseBdev3' 00:17:25.493 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:25.493 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:25.493 07:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.752 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.752 "name": "BaseBdev1", 00:17:25.752 "aliases": [ 00:17:25.752 "6dec36bf-5103-4cfe-891a-e08e8dc4cf08" 00:17:25.752 ], 00:17:25.752 "product_name": "Malloc disk", 00:17:25.752 "block_size": 512, 00:17:25.752 "num_blocks": 65536, 00:17:25.752 "uuid": "6dec36bf-5103-4cfe-891a-e08e8dc4cf08", 00:17:25.752 "assigned_rate_limits": { 00:17:25.752 "rw_ios_per_sec": 0, 00:17:25.752 "rw_mbytes_per_sec": 0, 00:17:25.752 "r_mbytes_per_sec": 0, 00:17:25.752 "w_mbytes_per_sec": 0 00:17:25.752 }, 00:17:25.752 "claimed": true, 00:17:25.752 "claim_type": "exclusive_write", 00:17:25.752 "zoned": false, 00:17:25.752 "supported_io_types": { 00:17:25.752 "read": true, 00:17:25.752 "write": true, 00:17:25.752 "unmap": true, 00:17:25.752 "flush": true, 00:17:25.752 "reset": true, 00:17:25.752 "nvme_admin": false, 00:17:25.752 "nvme_io": false, 00:17:25.752 "nvme_io_md": false, 00:17:25.752 "write_zeroes": true, 00:17:25.752 "zcopy": true, 00:17:25.752 "get_zone_info": false, 00:17:25.752 "zone_management": false, 00:17:25.752 "zone_append": false, 00:17:25.752 "compare": false, 00:17:25.752 "compare_and_write": false, 00:17:25.752 "abort": true, 00:17:25.752 "seek_hole": false, 00:17:25.752 "seek_data": false, 00:17:25.752 "copy": true, 00:17:25.752 "nvme_iov_md": false 00:17:25.752 }, 00:17:25.752 "memory_domains": [ 00:17:25.752 { 00:17:25.752 "dma_device_id": "system", 00:17:25.752 "dma_device_type": 1 00:17:25.752 }, 00:17:25.752 { 00:17:25.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.752 "dma_device_type": 2 00:17:25.752 } 00:17:25.752 ], 00:17:25.752 "driver_specific": {} 00:17:25.752 }' 00:17:25.752 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.752 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.011 07:22:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.011 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.271 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:26.271 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:26.271 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:26.271 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:26.271 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:26.271 "name": "BaseBdev2", 00:17:26.271 "aliases": [ 00:17:26.271 "60a48ea0-703a-4b35-9bc5-69044e1448ed" 00:17:26.271 ], 00:17:26.271 "product_name": "Malloc disk", 00:17:26.271 "block_size": 512, 00:17:26.271 "num_blocks": 65536, 00:17:26.271 "uuid": "60a48ea0-703a-4b35-9bc5-69044e1448ed", 00:17:26.271 "assigned_rate_limits": { 00:17:26.271 "rw_ios_per_sec": 0, 00:17:26.271 "rw_mbytes_per_sec": 0, 00:17:26.271 "r_mbytes_per_sec": 0, 00:17:26.271 "w_mbytes_per_sec": 0 00:17:26.271 }, 00:17:26.271 "claimed": true, 00:17:26.271 "claim_type": "exclusive_write", 00:17:26.271 "zoned": false, 00:17:26.271 "supported_io_types": { 00:17:26.271 "read": true, 00:17:26.271 "write": true, 00:17:26.271 "unmap": true, 00:17:26.271 "flush": true, 00:17:26.271 "reset": true, 00:17:26.271 "nvme_admin": false, 00:17:26.271 "nvme_io": false, 00:17:26.271 "nvme_io_md": false, 00:17:26.271 "write_zeroes": true, 00:17:26.271 "zcopy": true, 00:17:26.271 "get_zone_info": false, 00:17:26.271 "zone_management": false, 00:17:26.271 "zone_append": false, 00:17:26.271 "compare": false, 00:17:26.271 "compare_and_write": false, 00:17:26.271 "abort": true, 00:17:26.271 "seek_hole": false, 00:17:26.271 "seek_data": false, 00:17:26.271 "copy": true, 00:17:26.271 "nvme_iov_md": false 00:17:26.271 }, 00:17:26.271 "memory_domains": [ 00:17:26.271 { 00:17:26.271 "dma_device_id": "system", 00:17:26.271 "dma_device_type": 1 00:17:26.271 }, 00:17:26.271 { 00:17:26.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.271 "dma_device_type": 2 00:17:26.271 } 00:17:26.271 ], 00:17:26.271 "driver_specific": {} 00:17:26.271 }' 00:17:26.271 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.530 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.530 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:26.530 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.530 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.530 07:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:26.530 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.530 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.789 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:26.789 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:26.789 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.789 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:26.789 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:26.789 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:26.789 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:27.048 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:27.048 "name": "BaseBdev3", 00:17:27.049 "aliases": [ 00:17:27.049 "6b9b8245-abd9-49e8-a38a-1df85d97d965" 00:17:27.049 ], 00:17:27.049 "product_name": "Malloc disk", 00:17:27.049 "block_size": 512, 00:17:27.049 "num_blocks": 65536, 00:17:27.049 "uuid": "6b9b8245-abd9-49e8-a38a-1df85d97d965", 00:17:27.049 "assigned_rate_limits": { 00:17:27.049 "rw_ios_per_sec": 0, 00:17:27.049 "rw_mbytes_per_sec": 0, 00:17:27.049 "r_mbytes_per_sec": 0, 00:17:27.049 "w_mbytes_per_sec": 0 00:17:27.049 }, 00:17:27.049 "claimed": true, 00:17:27.049 "claim_type": "exclusive_write", 00:17:27.049 "zoned": false, 00:17:27.049 "supported_io_types": { 00:17:27.049 "read": true, 00:17:27.049 "write": true, 00:17:27.049 "unmap": true, 00:17:27.049 "flush": true, 00:17:27.049 "reset": true, 00:17:27.049 "nvme_admin": false, 00:17:27.049 "nvme_io": false, 00:17:27.049 "nvme_io_md": false, 00:17:27.049 "write_zeroes": true, 00:17:27.049 "zcopy": true, 00:17:27.049 "get_zone_info": false, 00:17:27.049 "zone_management": false, 00:17:27.049 "zone_append": false, 00:17:27.049 "compare": false, 00:17:27.049 "compare_and_write": false, 00:17:27.049 "abort": true, 00:17:27.049 "seek_hole": false, 00:17:27.049 "seek_data": false, 00:17:27.049 "copy": true, 00:17:27.049 "nvme_iov_md": false 00:17:27.049 }, 00:17:27.049 "memory_domains": [ 00:17:27.049 { 00:17:27.049 "dma_device_id": "system", 00:17:27.049 "dma_device_type": 1 00:17:27.049 }, 00:17:27.049 { 00:17:27.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.049 "dma_device_type": 2 00:17:27.049 } 00:17:27.049 ], 00:17:27.049 "driver_specific": {} 00:17:27.049 }' 00:17:27.049 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.049 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.049 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:27.049 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:27.308 07:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:27.568 [2024-07-25 07:23:00.033088] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.568 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.827 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.827 "name": "Existed_Raid", 00:17:27.827 "uuid": "4510765b-0b3a-432f-bb84-8c9e59eeb3e6", 00:17:27.827 "strip_size_kb": 0, 00:17:27.827 "state": "online", 00:17:27.827 "raid_level": "raid1", 00:17:27.827 "superblock": false, 00:17:27.827 "num_base_bdevs": 3, 00:17:27.827 "num_base_bdevs_discovered": 2, 00:17:27.827 "num_base_bdevs_operational": 2, 00:17:27.827 "base_bdevs_list": [ 00:17:27.827 { 00:17:27.827 "name": null, 00:17:27.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.827 "is_configured": false, 00:17:27.827 "data_offset": 0, 00:17:27.827 "data_size": 65536 00:17:27.827 }, 00:17:27.827 { 00:17:27.827 "name": "BaseBdev2", 00:17:27.827 "uuid": "60a48ea0-703a-4b35-9bc5-69044e1448ed", 00:17:27.827 "is_configured": true, 00:17:27.827 "data_offset": 0, 00:17:27.827 "data_size": 65536 00:17:27.827 }, 00:17:27.827 { 00:17:27.827 "name": "BaseBdev3", 00:17:27.827 "uuid": "6b9b8245-abd9-49e8-a38a-1df85d97d965", 00:17:27.827 "is_configured": true, 00:17:27.827 "data_offset": 0, 00:17:27.827 "data_size": 65536 00:17:27.827 } 00:17:27.827 ] 00:17:27.827 }' 
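Because raid1 is redundant, deleting one base bdev out from under the array (bdev_malloc_delete BaseBdev1 in the trace) leaves it online with the first slot cleared. A sketch of that step and the follow-up check, with the hypothetical $RPC shorthand:

# delete the malloc bdev backing the first slot
$RPC bdev_malloc_delete BaseBdev1

# the array must stay online, now with two base bdevs discovered
tmp=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
echo "$tmp" | jq -r .state                      # -> online
echo "$tmp" | jq -r .num_base_bdevs_discovered  # -> 2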
00:17:27.827 07:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.827 07:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.765 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:28.765 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:28.765 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.765 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:29.024 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:29.024 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:29.024 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:29.024 [2024-07-25 07:23:01.554028] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:29.283 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:29.283 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:29.283 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.283 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:29.283 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:29.283 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:29.283 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:29.542 [2024-07-25 07:23:01.949181] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:29.542 [2024-07-25 07:23:01.949257] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:29.542 [2024-07-25 07:23:01.959377] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:29.542 [2024-07-25 07:23:01.959406] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:29.542 [2024-07-25 07:23:01.959416] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f3680 name Existed_Raid, state offline 00:17:29.542 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:29.542 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:29.542 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.542 07:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:29.801 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:29.801 07:23:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:29.801 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:29.801 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:29.801 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:29.801 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:30.059 BaseBdev2 00:17:30.059 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:30.059 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:30.059 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:30.059 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:30.059 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:30.059 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:30.059 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:30.318 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:30.318 [ 00:17:30.318 { 00:17:30.318 "name": "BaseBdev2", 00:17:30.318 "aliases": [ 00:17:30.318 "30529713-25fd-4835-888f-c8cf3f288a44" 00:17:30.318 ], 00:17:30.318 "product_name": "Malloc disk", 00:17:30.318 "block_size": 512, 00:17:30.318 "num_blocks": 65536, 00:17:30.318 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:30.318 "assigned_rate_limits": { 00:17:30.318 "rw_ios_per_sec": 0, 00:17:30.318 "rw_mbytes_per_sec": 0, 00:17:30.318 "r_mbytes_per_sec": 0, 00:17:30.318 "w_mbytes_per_sec": 0 00:17:30.318 }, 00:17:30.318 "claimed": false, 00:17:30.318 "zoned": false, 00:17:30.318 "supported_io_types": { 00:17:30.318 "read": true, 00:17:30.318 "write": true, 00:17:30.318 "unmap": true, 00:17:30.318 "flush": true, 00:17:30.318 "reset": true, 00:17:30.318 "nvme_admin": false, 00:17:30.318 "nvme_io": false, 00:17:30.318 "nvme_io_md": false, 00:17:30.318 "write_zeroes": true, 00:17:30.318 "zcopy": true, 00:17:30.318 "get_zone_info": false, 00:17:30.318 "zone_management": false, 00:17:30.318 "zone_append": false, 00:17:30.318 "compare": false, 00:17:30.318 "compare_and_write": false, 00:17:30.318 "abort": true, 00:17:30.318 "seek_hole": false, 00:17:30.318 "seek_data": false, 00:17:30.318 "copy": true, 00:17:30.318 "nvme_iov_md": false 00:17:30.318 }, 00:17:30.318 "memory_domains": [ 00:17:30.318 { 00:17:30.318 "dma_device_id": "system", 00:17:30.318 "dma_device_type": 1 00:17:30.318 }, 00:17:30.318 { 00:17:30.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.318 "dma_device_type": 2 00:17:30.318 } 00:17:30.318 ], 00:17:30.318 "driver_specific": {} 00:17:30.318 } 00:17:30.318 ] 00:17:30.319 07:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:30.319 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 
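The waitforbdev helper that wraps BaseBdev2 and BaseBdev3 here is essentially "create, wait for examine, then fetch with a timeout". A minimal sketch (the $RPC shorthand is hypothetical; the RPC names, sizes and the 2000 ms timeout come from the trace):

# create a 32 MiB malloc bdev (65536 blocks of 512 bytes) as a future base bdev
$RPC bdev_malloc_create 32 512 -b BaseBdev2

# block until all registered examine callbacks have finished
$RPC bdev_wait_for_examine

# fetch the bdev, waiting up to 2000 ms for it to appear
$RPC bdev_get_bdevs -b BaseBdev2 -t 2000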
00:17:30.319 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:30.319 07:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:30.577 BaseBdev3 00:17:30.577 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:30.577 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:30.577 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:30.577 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:30.577 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:30.577 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:30.577 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:30.863 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:31.131 [ 00:17:31.131 { 00:17:31.131 "name": "BaseBdev3", 00:17:31.131 "aliases": [ 00:17:31.131 "faa3662c-a313-4771-9af4-15c740dcae8e" 00:17:31.131 ], 00:17:31.131 "product_name": "Malloc disk", 00:17:31.131 "block_size": 512, 00:17:31.131 "num_blocks": 65536, 00:17:31.131 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:31.131 "assigned_rate_limits": { 00:17:31.131 "rw_ios_per_sec": 0, 00:17:31.131 "rw_mbytes_per_sec": 0, 00:17:31.131 "r_mbytes_per_sec": 0, 00:17:31.131 "w_mbytes_per_sec": 0 00:17:31.131 }, 00:17:31.131 "claimed": false, 00:17:31.131 "zoned": false, 00:17:31.131 "supported_io_types": { 00:17:31.131 "read": true, 00:17:31.131 "write": true, 00:17:31.131 "unmap": true, 00:17:31.131 "flush": true, 00:17:31.131 "reset": true, 00:17:31.131 "nvme_admin": false, 00:17:31.131 "nvme_io": false, 00:17:31.131 "nvme_io_md": false, 00:17:31.131 "write_zeroes": true, 00:17:31.131 "zcopy": true, 00:17:31.131 "get_zone_info": false, 00:17:31.131 "zone_management": false, 00:17:31.131 "zone_append": false, 00:17:31.131 "compare": false, 00:17:31.131 "compare_and_write": false, 00:17:31.131 "abort": true, 00:17:31.131 "seek_hole": false, 00:17:31.131 "seek_data": false, 00:17:31.131 "copy": true, 00:17:31.131 "nvme_iov_md": false 00:17:31.131 }, 00:17:31.131 "memory_domains": [ 00:17:31.131 { 00:17:31.131 "dma_device_id": "system", 00:17:31.131 "dma_device_type": 1 00:17:31.131 }, 00:17:31.131 { 00:17:31.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.131 "dma_device_type": 2 00:17:31.131 } 00:17:31.131 ], 00:17:31.131 "driver_specific": {} 00:17:31.131 } 00:17:31.131 ] 00:17:31.131 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:31.131 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:31.131 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:31.131 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:31.131 [2024-07-25 07:23:03.656111] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:31.131 [2024-07-25 07:23:03.656157] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:31.131 [2024-07-25 07:23:03.656175] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:31.131 [2024-07-25 07:23:03.657336] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:31.390 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.391 "name": "Existed_Raid", 00:17:31.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.391 "strip_size_kb": 0, 00:17:31.391 "state": "configuring", 00:17:31.391 "raid_level": "raid1", 00:17:31.391 "superblock": false, 00:17:31.391 "num_base_bdevs": 3, 00:17:31.391 "num_base_bdevs_discovered": 2, 00:17:31.391 "num_base_bdevs_operational": 3, 00:17:31.391 "base_bdevs_list": [ 00:17:31.391 { 00:17:31.391 "name": "BaseBdev1", 00:17:31.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.391 "is_configured": false, 00:17:31.391 "data_offset": 0, 00:17:31.391 "data_size": 0 00:17:31.391 }, 00:17:31.391 { 00:17:31.391 "name": "BaseBdev2", 00:17:31.391 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:31.391 "is_configured": true, 00:17:31.391 "data_offset": 0, 00:17:31.391 "data_size": 65536 00:17:31.391 }, 00:17:31.391 { 00:17:31.391 "name": "BaseBdev3", 00:17:31.391 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:31.391 "is_configured": true, 00:17:31.391 "data_offset": 0, 00:17:31.391 "data_size": 65536 00:17:31.391 } 00:17:31.391 ] 00:17:31.391 }' 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.391 07:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.958 07:23:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:32.217 [2024-07-25 07:23:04.582538] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.217 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.476 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.476 "name": "Existed_Raid", 00:17:32.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.476 "strip_size_kb": 0, 00:17:32.476 "state": "configuring", 00:17:32.476 "raid_level": "raid1", 00:17:32.476 "superblock": false, 00:17:32.476 "num_base_bdevs": 3, 00:17:32.476 "num_base_bdevs_discovered": 1, 00:17:32.476 "num_base_bdevs_operational": 3, 00:17:32.476 "base_bdevs_list": [ 00:17:32.476 { 00:17:32.476 "name": "BaseBdev1", 00:17:32.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.476 "is_configured": false, 00:17:32.476 "data_offset": 0, 00:17:32.476 "data_size": 0 00:17:32.476 }, 00:17:32.476 { 00:17:32.476 "name": null, 00:17:32.476 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:32.476 "is_configured": false, 00:17:32.476 "data_offset": 0, 00:17:32.476 "data_size": 65536 00:17:32.476 }, 00:17:32.476 { 00:17:32.476 "name": "BaseBdev3", 00:17:32.476 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:32.476 "is_configured": true, 00:17:32.476 "data_offset": 0, 00:17:32.476 "data_size": 65536 00:17:32.476 } 00:17:32.476 ] 00:17:32.476 }' 00:17:32.476 07:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.476 07:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.045 07:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.045 07:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:33.304 07:23:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:33.304 07:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:33.563 [2024-07-25 07:23:05.865112] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:33.563 BaseBdev1 00:17:33.563 07:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:33.563 07:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:33.563 07:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:33.563 07:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:33.563 07:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:33.563 07:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:33.563 07:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:33.823 [ 00:17:33.823 { 00:17:33.823 "name": "BaseBdev1", 00:17:33.823 "aliases": [ 00:17:33.823 "5f949a9f-6a47-4b33-9e22-c46275b01e34" 00:17:33.823 ], 00:17:33.823 "product_name": "Malloc disk", 00:17:33.823 "block_size": 512, 00:17:33.823 "num_blocks": 65536, 00:17:33.823 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:33.823 "assigned_rate_limits": { 00:17:33.823 "rw_ios_per_sec": 0, 00:17:33.823 "rw_mbytes_per_sec": 0, 00:17:33.823 "r_mbytes_per_sec": 0, 00:17:33.823 "w_mbytes_per_sec": 0 00:17:33.823 }, 00:17:33.823 "claimed": true, 00:17:33.823 "claim_type": "exclusive_write", 00:17:33.823 "zoned": false, 00:17:33.823 "supported_io_types": { 00:17:33.823 "read": true, 00:17:33.823 "write": true, 00:17:33.823 "unmap": true, 00:17:33.823 "flush": true, 00:17:33.823 "reset": true, 00:17:33.823 "nvme_admin": false, 00:17:33.823 "nvme_io": false, 00:17:33.823 "nvme_io_md": false, 00:17:33.823 "write_zeroes": true, 00:17:33.823 "zcopy": true, 00:17:33.823 "get_zone_info": false, 00:17:33.823 "zone_management": false, 00:17:33.823 "zone_append": false, 00:17:33.823 "compare": false, 00:17:33.823 "compare_and_write": false, 00:17:33.823 "abort": true, 00:17:33.823 "seek_hole": false, 00:17:33.823 "seek_data": false, 00:17:33.823 "copy": true, 00:17:33.823 "nvme_iov_md": false 00:17:33.823 }, 00:17:33.823 "memory_domains": [ 00:17:33.823 { 00:17:33.823 "dma_device_id": "system", 00:17:33.823 "dma_device_type": 1 00:17:33.823 }, 00:17:33.823 { 00:17:33.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.823 "dma_device_type": 2 00:17:33.823 } 00:17:33.823 ], 00:17:33.823 "driver_specific": {} 00:17:33.823 } 00:17:33.823 ] 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.823 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:34.082 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.083 "name": "Existed_Raid", 00:17:34.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.083 "strip_size_kb": 0, 00:17:34.083 "state": "configuring", 00:17:34.083 "raid_level": "raid1", 00:17:34.083 "superblock": false, 00:17:34.083 "num_base_bdevs": 3, 00:17:34.083 "num_base_bdevs_discovered": 2, 00:17:34.083 "num_base_bdevs_operational": 3, 00:17:34.083 "base_bdevs_list": [ 00:17:34.083 { 00:17:34.083 "name": "BaseBdev1", 00:17:34.083 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:34.083 "is_configured": true, 00:17:34.083 "data_offset": 0, 00:17:34.083 "data_size": 65536 00:17:34.083 }, 00:17:34.083 { 00:17:34.083 "name": null, 00:17:34.083 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:34.083 "is_configured": false, 00:17:34.083 "data_offset": 0, 00:17:34.083 "data_size": 65536 00:17:34.083 }, 00:17:34.083 { 00:17:34.083 "name": "BaseBdev3", 00:17:34.083 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:34.083 "is_configured": true, 00:17:34.083 "data_offset": 0, 00:17:34.083 "data_size": 65536 00:17:34.083 } 00:17:34.083 ] 00:17:34.083 }' 00:17:34.083 07:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.083 07:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.650 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.650 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:34.909 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:34.910 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:35.169 [2024-07-25 07:23:07.493438] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:35.169 
07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.169 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.427 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.427 "name": "Existed_Raid", 00:17:35.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.427 "strip_size_kb": 0, 00:17:35.427 "state": "configuring", 00:17:35.428 "raid_level": "raid1", 00:17:35.428 "superblock": false, 00:17:35.428 "num_base_bdevs": 3, 00:17:35.428 "num_base_bdevs_discovered": 1, 00:17:35.428 "num_base_bdevs_operational": 3, 00:17:35.428 "base_bdevs_list": [ 00:17:35.428 { 00:17:35.428 "name": "BaseBdev1", 00:17:35.428 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:35.428 "is_configured": true, 00:17:35.428 "data_offset": 0, 00:17:35.428 "data_size": 65536 00:17:35.428 }, 00:17:35.428 { 00:17:35.428 "name": null, 00:17:35.428 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:35.428 "is_configured": false, 00:17:35.428 "data_offset": 0, 00:17:35.428 "data_size": 65536 00:17:35.428 }, 00:17:35.428 { 00:17:35.428 "name": null, 00:17:35.428 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:35.428 "is_configured": false, 00:17:35.428 "data_offset": 0, 00:17:35.428 "data_size": 65536 00:17:35.428 } 00:17:35.428 ] 00:17:35.428 }' 00:17:35.428 07:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.428 07:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.996 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.996 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:35.996 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:35.996 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:36.255 [2024-07-25 07:23:08.708646] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:36.255 07:23:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.255 07:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.823 07:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.823 "name": "Existed_Raid", 00:17:36.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.823 "strip_size_kb": 0, 00:17:36.823 "state": "configuring", 00:17:36.823 "raid_level": "raid1", 00:17:36.823 "superblock": false, 00:17:36.823 "num_base_bdevs": 3, 00:17:36.823 "num_base_bdevs_discovered": 2, 00:17:36.823 "num_base_bdevs_operational": 3, 00:17:36.823 "base_bdevs_list": [ 00:17:36.823 { 00:17:36.823 "name": "BaseBdev1", 00:17:36.823 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:36.823 "is_configured": true, 00:17:36.823 "data_offset": 0, 00:17:36.823 "data_size": 65536 00:17:36.823 }, 00:17:36.823 { 00:17:36.823 "name": null, 00:17:36.823 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:36.823 "is_configured": false, 00:17:36.823 "data_offset": 0, 00:17:36.823 "data_size": 65536 00:17:36.823 }, 00:17:36.823 { 00:17:36.823 "name": "BaseBdev3", 00:17:36.823 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:36.823 "is_configured": true, 00:17:36.823 "data_offset": 0, 00:17:36.823 "data_size": 65536 00:17:36.823 } 00:17:36.823 ] 00:17:36.823 }' 00:17:36.823 07:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.823 07:23:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.391 07:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.391 07:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:37.649 07:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:37.649 07:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:37.908 [2024-07-25 
07:23:10.188713] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.908 "name": "Existed_Raid", 00:17:37.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.908 "strip_size_kb": 0, 00:17:37.908 "state": "configuring", 00:17:37.908 "raid_level": "raid1", 00:17:37.908 "superblock": false, 00:17:37.908 "num_base_bdevs": 3, 00:17:37.908 "num_base_bdevs_discovered": 1, 00:17:37.908 "num_base_bdevs_operational": 3, 00:17:37.908 "base_bdevs_list": [ 00:17:37.908 { 00:17:37.908 "name": null, 00:17:37.908 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:37.908 "is_configured": false, 00:17:37.908 "data_offset": 0, 00:17:37.908 "data_size": 65536 00:17:37.908 }, 00:17:37.908 { 00:17:37.908 "name": null, 00:17:37.908 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:37.908 "is_configured": false, 00:17:37.908 "data_offset": 0, 00:17:37.908 "data_size": 65536 00:17:37.908 }, 00:17:37.908 { 00:17:37.908 "name": "BaseBdev3", 00:17:37.908 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:37.908 "is_configured": true, 00:17:37.908 "data_offset": 0, 00:17:37.908 "data_size": 65536 00:17:37.908 } 00:17:37.908 ] 00:17:37.908 }' 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.908 07:23:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.847 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.847 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:39.106 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:39.106 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:39.365 [2024-07-25 07:23:11.645699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.365 07:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.932 07:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.932 "name": "Existed_Raid", 00:17:39.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.932 "strip_size_kb": 0, 00:17:39.932 "state": "configuring", 00:17:39.932 "raid_level": "raid1", 00:17:39.932 "superblock": false, 00:17:39.932 "num_base_bdevs": 3, 00:17:39.932 "num_base_bdevs_discovered": 2, 00:17:39.932 "num_base_bdevs_operational": 3, 00:17:39.932 "base_bdevs_list": [ 00:17:39.932 { 00:17:39.932 "name": null, 00:17:39.932 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:39.932 "is_configured": false, 00:17:39.932 "data_offset": 0, 00:17:39.932 "data_size": 65536 00:17:39.932 }, 00:17:39.932 { 00:17:39.932 "name": "BaseBdev2", 00:17:39.932 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:39.932 "is_configured": true, 00:17:39.932 "data_offset": 0, 00:17:39.932 "data_size": 65536 00:17:39.932 }, 00:17:39.932 { 00:17:39.932 "name": "BaseBdev3", 00:17:39.932 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:39.932 "is_configured": true, 00:17:39.932 "data_offset": 0, 00:17:39.932 "data_size": 65536 00:17:39.932 } 00:17:39.932 ] 00:17:39.932 }' 00:17:39.932 07:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.932 07:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.500 07:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.500 07:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:40.500 07:23:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:40.500 07:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.500 07:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:40.760 07:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5f949a9f-6a47-4b33-9e22-c46275b01e34 00:17:41.327 [2024-07-25 07:23:13.666209] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:41.327 [2024-07-25 07:23:13.666245] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x8f3a70 00:17:41.327 [2024-07-25 07:23:13.666253] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:41.327 [2024-07-25 07:23:13.666435] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8f2e90 00:17:41.327 [2024-07-25 07:23:13.666555] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8f3a70 00:17:41.327 [2024-07-25 07:23:13.666564] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8f3a70 00:17:41.327 [2024-07-25 07:23:13.666718] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:41.327 NewBaseBdev 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:41.327 07:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:41.895 [ 00:17:41.895 { 00:17:41.895 "name": "NewBaseBdev", 00:17:41.895 "aliases": [ 00:17:41.895 "5f949a9f-6a47-4b33-9e22-c46275b01e34" 00:17:41.895 ], 00:17:41.895 "product_name": "Malloc disk", 00:17:41.895 "block_size": 512, 00:17:41.895 "num_blocks": 65536, 00:17:41.895 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:41.895 "assigned_rate_limits": { 00:17:41.895 "rw_ios_per_sec": 0, 00:17:41.895 "rw_mbytes_per_sec": 0, 00:17:41.895 "r_mbytes_per_sec": 0, 00:17:41.895 "w_mbytes_per_sec": 0 00:17:41.895 }, 00:17:41.895 "claimed": true, 00:17:41.895 "claim_type": "exclusive_write", 00:17:41.895 "zoned": false, 00:17:41.895 "supported_io_types": { 00:17:41.895 "read": true, 00:17:41.895 "write": true, 00:17:41.895 "unmap": true, 00:17:41.895 "flush": true, 00:17:41.895 "reset": true, 00:17:41.895 "nvme_admin": false, 00:17:41.895 "nvme_io": false, 00:17:41.895 "nvme_io_md": false, 
00:17:41.895 "write_zeroes": true, 00:17:41.895 "zcopy": true, 00:17:41.895 "get_zone_info": false, 00:17:41.895 "zone_management": false, 00:17:41.895 "zone_append": false, 00:17:41.895 "compare": false, 00:17:41.895 "compare_and_write": false, 00:17:41.895 "abort": true, 00:17:41.895 "seek_hole": false, 00:17:41.895 "seek_data": false, 00:17:41.895 "copy": true, 00:17:41.895 "nvme_iov_md": false 00:17:41.895 }, 00:17:41.895 "memory_domains": [ 00:17:41.895 { 00:17:41.895 "dma_device_id": "system", 00:17:41.895 "dma_device_type": 1 00:17:41.895 }, 00:17:41.895 { 00:17:41.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.895 "dma_device_type": 2 00:17:41.895 } 00:17:41.895 ], 00:17:41.895 "driver_specific": {} 00:17:41.895 } 00:17:41.895 ] 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.895 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.169 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.169 "name": "Existed_Raid", 00:17:42.169 "uuid": "a8609836-d786-403f-93d6-79757748e5f1", 00:17:42.169 "strip_size_kb": 0, 00:17:42.169 "state": "online", 00:17:42.169 "raid_level": "raid1", 00:17:42.169 "superblock": false, 00:17:42.169 "num_base_bdevs": 3, 00:17:42.169 "num_base_bdevs_discovered": 3, 00:17:42.169 "num_base_bdevs_operational": 3, 00:17:42.169 "base_bdevs_list": [ 00:17:42.169 { 00:17:42.169 "name": "NewBaseBdev", 00:17:42.169 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:42.169 "is_configured": true, 00:17:42.169 "data_offset": 0, 00:17:42.169 "data_size": 65536 00:17:42.169 }, 00:17:42.169 { 00:17:42.169 "name": "BaseBdev2", 00:17:42.169 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:42.169 "is_configured": true, 00:17:42.169 "data_offset": 0, 00:17:42.169 "data_size": 65536 00:17:42.169 }, 00:17:42.169 { 00:17:42.169 "name": "BaseBdev3", 00:17:42.169 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:42.169 "is_configured": true, 00:17:42.169 "data_offset": 0, 00:17:42.169 "data_size": 65536 00:17:42.169 } 00:17:42.169 ] 00:17:42.169 }' 
00:17:42.169 07:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.169 07:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:42.737 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:42.996 [2024-07-25 07:23:15.407083] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:42.996 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:42.996 "name": "Existed_Raid", 00:17:42.996 "aliases": [ 00:17:42.996 "a8609836-d786-403f-93d6-79757748e5f1" 00:17:42.996 ], 00:17:42.996 "product_name": "Raid Volume", 00:17:42.996 "block_size": 512, 00:17:42.996 "num_blocks": 65536, 00:17:42.996 "uuid": "a8609836-d786-403f-93d6-79757748e5f1", 00:17:42.996 "assigned_rate_limits": { 00:17:42.996 "rw_ios_per_sec": 0, 00:17:42.996 "rw_mbytes_per_sec": 0, 00:17:42.996 "r_mbytes_per_sec": 0, 00:17:42.996 "w_mbytes_per_sec": 0 00:17:42.996 }, 00:17:42.996 "claimed": false, 00:17:42.996 "zoned": false, 00:17:42.996 "supported_io_types": { 00:17:42.996 "read": true, 00:17:42.996 "write": true, 00:17:42.996 "unmap": false, 00:17:42.996 "flush": false, 00:17:42.996 "reset": true, 00:17:42.996 "nvme_admin": false, 00:17:42.996 "nvme_io": false, 00:17:42.996 "nvme_io_md": false, 00:17:42.996 "write_zeroes": true, 00:17:42.996 "zcopy": false, 00:17:42.996 "get_zone_info": false, 00:17:42.996 "zone_management": false, 00:17:42.996 "zone_append": false, 00:17:42.996 "compare": false, 00:17:42.996 "compare_and_write": false, 00:17:42.996 "abort": false, 00:17:42.996 "seek_hole": false, 00:17:42.996 "seek_data": false, 00:17:42.996 "copy": false, 00:17:42.996 "nvme_iov_md": false 00:17:42.996 }, 00:17:42.996 "memory_domains": [ 00:17:42.996 { 00:17:42.996 "dma_device_id": "system", 00:17:42.996 "dma_device_type": 1 00:17:42.996 }, 00:17:42.996 { 00:17:42.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.996 "dma_device_type": 2 00:17:42.996 }, 00:17:42.996 { 00:17:42.996 "dma_device_id": "system", 00:17:42.996 "dma_device_type": 1 00:17:42.996 }, 00:17:42.996 { 00:17:42.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.996 "dma_device_type": 2 00:17:42.996 }, 00:17:42.996 { 00:17:42.996 "dma_device_id": "system", 00:17:42.996 "dma_device_type": 1 00:17:42.996 }, 00:17:42.996 { 00:17:42.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.996 "dma_device_type": 2 00:17:42.996 } 00:17:42.996 ], 00:17:42.996 "driver_specific": { 00:17:42.996 "raid": { 00:17:42.996 "uuid": "a8609836-d786-403f-93d6-79757748e5f1", 00:17:42.996 "strip_size_kb": 0, 00:17:42.996 
"state": "online", 00:17:42.996 "raid_level": "raid1", 00:17:42.996 "superblock": false, 00:17:42.996 "num_base_bdevs": 3, 00:17:42.996 "num_base_bdevs_discovered": 3, 00:17:42.996 "num_base_bdevs_operational": 3, 00:17:42.996 "base_bdevs_list": [ 00:17:42.996 { 00:17:42.996 "name": "NewBaseBdev", 00:17:42.996 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:42.996 "is_configured": true, 00:17:42.996 "data_offset": 0, 00:17:42.996 "data_size": 65536 00:17:42.996 }, 00:17:42.996 { 00:17:42.996 "name": "BaseBdev2", 00:17:42.996 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:42.996 "is_configured": true, 00:17:42.996 "data_offset": 0, 00:17:42.996 "data_size": 65536 00:17:42.996 }, 00:17:42.996 { 00:17:42.996 "name": "BaseBdev3", 00:17:42.996 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:42.996 "is_configured": true, 00:17:42.996 "data_offset": 0, 00:17:42.996 "data_size": 65536 00:17:42.996 } 00:17:42.996 ] 00:17:42.996 } 00:17:42.996 } 00:17:42.996 }' 00:17:42.996 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:42.996 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:42.996 BaseBdev2 00:17:42.996 BaseBdev3' 00:17:42.996 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.996 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:42.996 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.256 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.256 "name": "NewBaseBdev", 00:17:43.256 "aliases": [ 00:17:43.256 "5f949a9f-6a47-4b33-9e22-c46275b01e34" 00:17:43.256 ], 00:17:43.256 "product_name": "Malloc disk", 00:17:43.256 "block_size": 512, 00:17:43.256 "num_blocks": 65536, 00:17:43.256 "uuid": "5f949a9f-6a47-4b33-9e22-c46275b01e34", 00:17:43.256 "assigned_rate_limits": { 00:17:43.256 "rw_ios_per_sec": 0, 00:17:43.256 "rw_mbytes_per_sec": 0, 00:17:43.256 "r_mbytes_per_sec": 0, 00:17:43.256 "w_mbytes_per_sec": 0 00:17:43.256 }, 00:17:43.256 "claimed": true, 00:17:43.256 "claim_type": "exclusive_write", 00:17:43.256 "zoned": false, 00:17:43.256 "supported_io_types": { 00:17:43.256 "read": true, 00:17:43.256 "write": true, 00:17:43.256 "unmap": true, 00:17:43.256 "flush": true, 00:17:43.256 "reset": true, 00:17:43.256 "nvme_admin": false, 00:17:43.256 "nvme_io": false, 00:17:43.256 "nvme_io_md": false, 00:17:43.256 "write_zeroes": true, 00:17:43.256 "zcopy": true, 00:17:43.256 "get_zone_info": false, 00:17:43.256 "zone_management": false, 00:17:43.256 "zone_append": false, 00:17:43.256 "compare": false, 00:17:43.256 "compare_and_write": false, 00:17:43.256 "abort": true, 00:17:43.256 "seek_hole": false, 00:17:43.256 "seek_data": false, 00:17:43.256 "copy": true, 00:17:43.256 "nvme_iov_md": false 00:17:43.256 }, 00:17:43.256 "memory_domains": [ 00:17:43.256 { 00:17:43.256 "dma_device_id": "system", 00:17:43.256 "dma_device_type": 1 00:17:43.256 }, 00:17:43.256 { 00:17:43.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.256 "dma_device_type": 2 00:17:43.256 } 00:17:43.256 ], 00:17:43.256 "driver_specific": {} 00:17:43.256 }' 00:17:43.256 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:17:43.256 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.256 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.256 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.515 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.515 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.515 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.515 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.515 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.515 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.515 07:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.515 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.515 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:43.515 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:43.515 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.774 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.774 "name": "BaseBdev2", 00:17:43.774 "aliases": [ 00:17:43.774 "30529713-25fd-4835-888f-c8cf3f288a44" 00:17:43.774 ], 00:17:43.774 "product_name": "Malloc disk", 00:17:43.774 "block_size": 512, 00:17:43.775 "num_blocks": 65536, 00:17:43.775 "uuid": "30529713-25fd-4835-888f-c8cf3f288a44", 00:17:43.775 "assigned_rate_limits": { 00:17:43.775 "rw_ios_per_sec": 0, 00:17:43.775 "rw_mbytes_per_sec": 0, 00:17:43.775 "r_mbytes_per_sec": 0, 00:17:43.775 "w_mbytes_per_sec": 0 00:17:43.775 }, 00:17:43.775 "claimed": true, 00:17:43.775 "claim_type": "exclusive_write", 00:17:43.775 "zoned": false, 00:17:43.775 "supported_io_types": { 00:17:43.775 "read": true, 00:17:43.775 "write": true, 00:17:43.775 "unmap": true, 00:17:43.775 "flush": true, 00:17:43.775 "reset": true, 00:17:43.775 "nvme_admin": false, 00:17:43.775 "nvme_io": false, 00:17:43.775 "nvme_io_md": false, 00:17:43.775 "write_zeroes": true, 00:17:43.775 "zcopy": true, 00:17:43.775 "get_zone_info": false, 00:17:43.775 "zone_management": false, 00:17:43.775 "zone_append": false, 00:17:43.775 "compare": false, 00:17:43.775 "compare_and_write": false, 00:17:43.775 "abort": true, 00:17:43.775 "seek_hole": false, 00:17:43.775 "seek_data": false, 00:17:43.775 "copy": true, 00:17:43.775 "nvme_iov_md": false 00:17:43.775 }, 00:17:43.775 "memory_domains": [ 00:17:43.775 { 00:17:43.775 "dma_device_id": "system", 00:17:43.775 "dma_device_type": 1 00:17:43.775 }, 00:17:43.775 { 00:17:43.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.775 "dma_device_type": 2 00:17:43.775 } 00:17:43.775 ], 00:17:43.775 "driver_specific": {} 00:17:43.775 }' 00:17:43.775 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.775 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.034 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.327 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.328 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.328 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:44.328 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:44.328 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:44.328 "name": "BaseBdev3", 00:17:44.328 "aliases": [ 00:17:44.328 "faa3662c-a313-4771-9af4-15c740dcae8e" 00:17:44.328 ], 00:17:44.328 "product_name": "Malloc disk", 00:17:44.328 "block_size": 512, 00:17:44.328 "num_blocks": 65536, 00:17:44.328 "uuid": "faa3662c-a313-4771-9af4-15c740dcae8e", 00:17:44.328 "assigned_rate_limits": { 00:17:44.328 "rw_ios_per_sec": 0, 00:17:44.328 "rw_mbytes_per_sec": 0, 00:17:44.328 "r_mbytes_per_sec": 0, 00:17:44.328 "w_mbytes_per_sec": 0 00:17:44.328 }, 00:17:44.328 "claimed": true, 00:17:44.328 "claim_type": "exclusive_write", 00:17:44.328 "zoned": false, 00:17:44.328 "supported_io_types": { 00:17:44.328 "read": true, 00:17:44.328 "write": true, 00:17:44.328 "unmap": true, 00:17:44.328 "flush": true, 00:17:44.328 "reset": true, 00:17:44.328 "nvme_admin": false, 00:17:44.328 "nvme_io": false, 00:17:44.328 "nvme_io_md": false, 00:17:44.328 "write_zeroes": true, 00:17:44.328 "zcopy": true, 00:17:44.328 "get_zone_info": false, 00:17:44.328 "zone_management": false, 00:17:44.328 "zone_append": false, 00:17:44.328 "compare": false, 00:17:44.328 "compare_and_write": false, 00:17:44.328 "abort": true, 00:17:44.328 "seek_hole": false, 00:17:44.328 "seek_data": false, 00:17:44.328 "copy": true, 00:17:44.328 "nvme_iov_md": false 00:17:44.328 }, 00:17:44.328 "memory_domains": [ 00:17:44.328 { 00:17:44.328 "dma_device_id": "system", 00:17:44.328 "dma_device_type": 1 00:17:44.328 }, 00:17:44.328 { 00:17:44.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.328 "dma_device_type": 2 00:17:44.328 } 00:17:44.328 ], 00:17:44.328 "driver_specific": {} 00:17:44.328 }' 00:17:44.328 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.611 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.611 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:44.611 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.611 07:23:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.611 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:44.611 07:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.611 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.611 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:44.611 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.611 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.870 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.870 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:44.870 [2024-07-25 07:23:17.379990] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:44.870 [2024-07-25 07:23:17.380014] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:44.870 [2024-07-25 07:23:17.380063] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:44.870 [2024-07-25 07:23:17.380309] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:44.870 [2024-07-25 07:23:17.380321] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8f3a70 name Existed_Raid, state offline 00:17:44.870 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1643056 00:17:44.870 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1643056 ']' 00:17:44.870 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1643056 00:17:44.870 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:17:45.130 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:45.130 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1643056 00:17:45.130 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:45.130 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:45.130 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1643056' 00:17:45.130 killing process with pid 1643056 00:17:45.130 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1643056 00:17:45.130 [2024-07-25 07:23:17.455837] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:45.130 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1643056 00:17:45.130 [2024-07-25 07:23:17.479544] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:45.390 00:17:45.390 real 0m28.099s 00:17:45.390 user 0m51.679s 00:17:45.390 sys 0m4.912s 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:45.390 07:23:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.390 ************************************ 00:17:45.390 END TEST raid_state_function_test 00:17:45.390 ************************************ 00:17:45.390 07:23:17 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:17:45.390 07:23:17 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:45.390 07:23:17 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:45.390 07:23:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:45.390 ************************************ 00:17:45.390 START TEST raid_state_function_test_sb 00:17:45.390 ************************************ 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 true 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' 
true = true ']' 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1648398 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1648398' 00:17:45.390 Process raid pid: 1648398 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1648398 /var/tmp/spdk-raid.sock 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1648398 ']' 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:45.390 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:45.390 07:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:45.390 [2024-07-25 07:23:17.822238] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:17:45.390 [2024-07-25 07:23:17.822302] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested 
device 0000:3d:02.2 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:45.390 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.390 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:45.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:45.391 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:45.650 [2024-07-25 07:23:17.953374] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:45.650 [2024-07-25 07:23:18.036545] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:17:45.650 [2024-07-25 07:23:18.096293] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:45.650 [2024-07-25 07:23:18.096328] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:46.218 07:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 
00:17:46.218 07:23:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:17:46.218 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:46.478 [2024-07-25 07:23:18.930522] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:46.478 [2024-07-25 07:23:18.930558] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:46.478 [2024-07-25 07:23:18.930568] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:46.478 [2024-07-25 07:23:18.930579] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:46.478 [2024-07-25 07:23:18.930587] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:46.478 [2024-07-25 07:23:18.930597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.478 07:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:46.737 07:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.737 "name": "Existed_Raid", 00:17:46.737 "uuid": "5b3f3613-0dd9-4205-8b13-70a7bffb25ca", 00:17:46.737 "strip_size_kb": 0, 00:17:46.737 "state": "configuring", 00:17:46.737 "raid_level": "raid1", 00:17:46.737 "superblock": true, 00:17:46.737 "num_base_bdevs": 3, 00:17:46.737 "num_base_bdevs_discovered": 0, 00:17:46.737 "num_base_bdevs_operational": 3, 00:17:46.737 "base_bdevs_list": [ 00:17:46.737 { 00:17:46.737 "name": "BaseBdev1", 00:17:46.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:46.737 "is_configured": false, 00:17:46.737 "data_offset": 0, 00:17:46.737 "data_size": 0 00:17:46.737 }, 00:17:46.737 { 00:17:46.737 "name": "BaseBdev2", 00:17:46.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:46.737 "is_configured": false, 
00:17:46.737 "data_offset": 0, 00:17:46.737 "data_size": 0 00:17:46.737 }, 00:17:46.737 { 00:17:46.737 "name": "BaseBdev3", 00:17:46.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:46.737 "is_configured": false, 00:17:46.737 "data_offset": 0, 00:17:46.737 "data_size": 0 00:17:46.737 } 00:17:46.737 ] 00:17:46.737 }' 00:17:46.737 07:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.737 07:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:47.674 07:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:47.933 [2024-07-25 07:23:20.225765] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:47.933 [2024-07-25 07:23:20.225797] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x191aec0 name Existed_Raid, state configuring 00:17:47.933 07:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:47.933 [2024-07-25 07:23:20.450368] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:47.933 [2024-07-25 07:23:20.450392] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:47.933 [2024-07-25 07:23:20.450402] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:47.933 [2024-07-25 07:23:20.450413] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:47.933 [2024-07-25 07:23:20.450422] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:47.933 [2024-07-25 07:23:20.450432] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:48.192 [2024-07-25 07:23:20.684538] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:48.192 BaseBdev1 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:48.192 07:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:48.450 07:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:17:48.710 [ 00:17:48.710 { 00:17:48.710 "name": "BaseBdev1", 00:17:48.710 "aliases": [ 00:17:48.710 "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089" 00:17:48.710 ], 00:17:48.710 "product_name": "Malloc disk", 00:17:48.710 "block_size": 512, 00:17:48.710 "num_blocks": 65536, 00:17:48.710 "uuid": "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089", 00:17:48.710 "assigned_rate_limits": { 00:17:48.710 "rw_ios_per_sec": 0, 00:17:48.710 "rw_mbytes_per_sec": 0, 00:17:48.710 "r_mbytes_per_sec": 0, 00:17:48.710 "w_mbytes_per_sec": 0 00:17:48.710 }, 00:17:48.710 "claimed": true, 00:17:48.710 "claim_type": "exclusive_write", 00:17:48.710 "zoned": false, 00:17:48.710 "supported_io_types": { 00:17:48.710 "read": true, 00:17:48.710 "write": true, 00:17:48.710 "unmap": true, 00:17:48.710 "flush": true, 00:17:48.710 "reset": true, 00:17:48.710 "nvme_admin": false, 00:17:48.710 "nvme_io": false, 00:17:48.710 "nvme_io_md": false, 00:17:48.710 "write_zeroes": true, 00:17:48.710 "zcopy": true, 00:17:48.710 "get_zone_info": false, 00:17:48.710 "zone_management": false, 00:17:48.710 "zone_append": false, 00:17:48.710 "compare": false, 00:17:48.710 "compare_and_write": false, 00:17:48.710 "abort": true, 00:17:48.710 "seek_hole": false, 00:17:48.710 "seek_data": false, 00:17:48.710 "copy": true, 00:17:48.710 "nvme_iov_md": false 00:17:48.710 }, 00:17:48.710 "memory_domains": [ 00:17:48.710 { 00:17:48.710 "dma_device_id": "system", 00:17:48.710 "dma_device_type": 1 00:17:48.710 }, 00:17:48.710 { 00:17:48.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.710 "dma_device_type": 2 00:17:48.710 } 00:17:48.710 ], 00:17:48.710 "driver_specific": {} 00:17:48.710 } 00:17:48.710 ] 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.710 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:48.969 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.969 "name": "Existed_Raid", 00:17:48.969 "uuid": "0d1d892b-4083-478b-a36e-7c0483ceb7e3", 00:17:48.969 "strip_size_kb": 0, 00:17:48.969 "state": 
"configuring", 00:17:48.969 "raid_level": "raid1", 00:17:48.969 "superblock": true, 00:17:48.970 "num_base_bdevs": 3, 00:17:48.970 "num_base_bdevs_discovered": 1, 00:17:48.970 "num_base_bdevs_operational": 3, 00:17:48.970 "base_bdevs_list": [ 00:17:48.970 { 00:17:48.970 "name": "BaseBdev1", 00:17:48.970 "uuid": "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089", 00:17:48.970 "is_configured": true, 00:17:48.970 "data_offset": 2048, 00:17:48.970 "data_size": 63488 00:17:48.970 }, 00:17:48.970 { 00:17:48.970 "name": "BaseBdev2", 00:17:48.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.970 "is_configured": false, 00:17:48.970 "data_offset": 0, 00:17:48.970 "data_size": 0 00:17:48.970 }, 00:17:48.970 { 00:17:48.970 "name": "BaseBdev3", 00:17:48.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.970 "is_configured": false, 00:17:48.970 "data_offset": 0, 00:17:48.970 "data_size": 0 00:17:48.970 } 00:17:48.970 ] 00:17:48.970 }' 00:17:48.970 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.970 07:23:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:49.543 07:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:49.544 [2024-07-25 07:23:22.052130] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:49.544 [2024-07-25 07:23:22.052174] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x191a790 name Existed_Raid, state configuring 00:17:49.544 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:49.805 [2024-07-25 07:23:22.280765] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:49.805 [2024-07-25 07:23:22.282121] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:49.805 [2024-07-25 07:23:22.282161] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:49.805 [2024-07-25 07:23:22.282171] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:49.805 [2024-07-25 07:23:22.282182] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:49.805 07:23:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:49.805 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.065 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.065 "name": "Existed_Raid", 00:17:50.065 "uuid": "e62f6e49-ec79-46d4-89a9-518345c32959", 00:17:50.065 "strip_size_kb": 0, 00:17:50.065 "state": "configuring", 00:17:50.065 "raid_level": "raid1", 00:17:50.065 "superblock": true, 00:17:50.065 "num_base_bdevs": 3, 00:17:50.065 "num_base_bdevs_discovered": 1, 00:17:50.065 "num_base_bdevs_operational": 3, 00:17:50.065 "base_bdevs_list": [ 00:17:50.065 { 00:17:50.065 "name": "BaseBdev1", 00:17:50.065 "uuid": "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089", 00:17:50.065 "is_configured": true, 00:17:50.065 "data_offset": 2048, 00:17:50.065 "data_size": 63488 00:17:50.065 }, 00:17:50.065 { 00:17:50.065 "name": "BaseBdev2", 00:17:50.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.065 "is_configured": false, 00:17:50.065 "data_offset": 0, 00:17:50.065 "data_size": 0 00:17:50.065 }, 00:17:50.065 { 00:17:50.065 "name": "BaseBdev3", 00:17:50.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.065 "is_configured": false, 00:17:50.065 "data_offset": 0, 00:17:50.065 "data_size": 0 00:17:50.065 } 00:17:50.065 ] 00:17:50.065 }' 00:17:50.065 07:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.065 07:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:50.633 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:50.892 [2024-07-25 07:23:23.318601] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:50.892 BaseBdev2 00:17:50.892 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:50.892 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:50.892 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:50.892 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:50.892 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:50.892 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:50.892 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:51.151 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:51.410 [ 00:17:51.410 { 00:17:51.410 "name": "BaseBdev2", 00:17:51.410 "aliases": [ 00:17:51.410 "cf818587-f6eb-421f-a41e-6584dde14a97" 00:17:51.410 ], 00:17:51.410 "product_name": "Malloc disk", 00:17:51.410 "block_size": 512, 00:17:51.410 "num_blocks": 65536, 00:17:51.410 "uuid": "cf818587-f6eb-421f-a41e-6584dde14a97", 00:17:51.410 "assigned_rate_limits": { 00:17:51.410 "rw_ios_per_sec": 0, 00:17:51.410 "rw_mbytes_per_sec": 0, 00:17:51.410 "r_mbytes_per_sec": 0, 00:17:51.410 "w_mbytes_per_sec": 0 00:17:51.410 }, 00:17:51.410 "claimed": true, 00:17:51.410 "claim_type": "exclusive_write", 00:17:51.410 "zoned": false, 00:17:51.410 "supported_io_types": { 00:17:51.410 "read": true, 00:17:51.410 "write": true, 00:17:51.410 "unmap": true, 00:17:51.410 "flush": true, 00:17:51.410 "reset": true, 00:17:51.410 "nvme_admin": false, 00:17:51.410 "nvme_io": false, 00:17:51.410 "nvme_io_md": false, 00:17:51.410 "write_zeroes": true, 00:17:51.410 "zcopy": true, 00:17:51.410 "get_zone_info": false, 00:17:51.410 "zone_management": false, 00:17:51.410 "zone_append": false, 00:17:51.410 "compare": false, 00:17:51.410 "compare_and_write": false, 00:17:51.410 "abort": true, 00:17:51.410 "seek_hole": false, 00:17:51.410 "seek_data": false, 00:17:51.410 "copy": true, 00:17:51.410 "nvme_iov_md": false 00:17:51.410 }, 00:17:51.410 "memory_domains": [ 00:17:51.410 { 00:17:51.410 "dma_device_id": "system", 00:17:51.410 "dma_device_type": 1 00:17:51.410 }, 00:17:51.410 { 00:17:51.410 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.410 "dma_device_type": 2 00:17:51.410 } 00:17:51.410 ], 00:17:51.410 "driver_specific": {} 00:17:51.410 } 00:17:51.410 ] 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.410 07:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.410 07:23:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.669 07:23:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.669 "name": "Existed_Raid", 00:17:51.669 "uuid": "e62f6e49-ec79-46d4-89a9-518345c32959", 00:17:51.669 "strip_size_kb": 0, 00:17:51.669 "state": "configuring", 00:17:51.669 "raid_level": "raid1", 00:17:51.669 "superblock": true, 00:17:51.669 "num_base_bdevs": 3, 00:17:51.669 "num_base_bdevs_discovered": 2, 00:17:51.669 "num_base_bdevs_operational": 3, 00:17:51.669 "base_bdevs_list": [ 00:17:51.669 { 00:17:51.669 "name": "BaseBdev1", 00:17:51.669 "uuid": "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089", 00:17:51.669 "is_configured": true, 00:17:51.669 "data_offset": 2048, 00:17:51.669 "data_size": 63488 00:17:51.669 }, 00:17:51.669 { 00:17:51.669 "name": "BaseBdev2", 00:17:51.669 "uuid": "cf818587-f6eb-421f-a41e-6584dde14a97", 00:17:51.669 "is_configured": true, 00:17:51.669 "data_offset": 2048, 00:17:51.669 "data_size": 63488 00:17:51.669 }, 00:17:51.669 { 00:17:51.669 "name": "BaseBdev3", 00:17:51.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.669 "is_configured": false, 00:17:51.669 "data_offset": 0, 00:17:51.669 "data_size": 0 00:17:51.669 } 00:17:51.669 ] 00:17:51.669 }' 00:17:51.669 07:23:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.669 07:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:52.237 07:23:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:52.496 [2024-07-25 07:23:24.785620] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:52.496 [2024-07-25 07:23:24.785766] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x191b680 00:17:52.496 [2024-07-25 07:23:24.785779] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:52.496 [2024-07-25 07:23:24.785935] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x191b350 00:17:52.496 [2024-07-25 07:23:24.786056] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x191b680 00:17:52.496 [2024-07-25 07:23:24.786065] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x191b680 00:17:52.496 [2024-07-25 07:23:24.786161] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:52.496 BaseBdev3 00:17:52.496 07:23:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:52.496 07:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:52.496 07:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:52.496 07:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:52.496 07:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:52.496 07:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:52.496 07:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:17:52.496 07:23:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:52.756 [ 00:17:52.756 { 00:17:52.756 "name": "BaseBdev3", 00:17:52.756 "aliases": [ 00:17:52.756 "23ec435c-9b95-42c8-bf63-3511dfa91792" 00:17:52.756 ], 00:17:52.756 "product_name": "Malloc disk", 00:17:52.756 "block_size": 512, 00:17:52.756 "num_blocks": 65536, 00:17:52.756 "uuid": "23ec435c-9b95-42c8-bf63-3511dfa91792", 00:17:52.756 "assigned_rate_limits": { 00:17:52.756 "rw_ios_per_sec": 0, 00:17:52.756 "rw_mbytes_per_sec": 0, 00:17:52.756 "r_mbytes_per_sec": 0, 00:17:52.756 "w_mbytes_per_sec": 0 00:17:52.756 }, 00:17:52.756 "claimed": true, 00:17:52.756 "claim_type": "exclusive_write", 00:17:52.756 "zoned": false, 00:17:52.756 "supported_io_types": { 00:17:52.756 "read": true, 00:17:52.756 "write": true, 00:17:52.756 "unmap": true, 00:17:52.756 "flush": true, 00:17:52.756 "reset": true, 00:17:52.756 "nvme_admin": false, 00:17:52.756 "nvme_io": false, 00:17:52.756 "nvme_io_md": false, 00:17:52.756 "write_zeroes": true, 00:17:52.756 "zcopy": true, 00:17:52.756 "get_zone_info": false, 00:17:52.756 "zone_management": false, 00:17:52.756 "zone_append": false, 00:17:52.756 "compare": false, 00:17:52.756 "compare_and_write": false, 00:17:52.756 "abort": true, 00:17:52.756 "seek_hole": false, 00:17:52.756 "seek_data": false, 00:17:52.756 "copy": true, 00:17:52.756 "nvme_iov_md": false 00:17:52.756 }, 00:17:52.756 "memory_domains": [ 00:17:52.756 { 00:17:52.756 "dma_device_id": "system", 00:17:52.756 "dma_device_type": 1 00:17:52.756 }, 00:17:52.756 { 00:17:52.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.756 "dma_device_type": 2 00:17:52.756 } 00:17:52.756 ], 00:17:52.756 "driver_specific": {} 00:17:52.756 } 00:17:52.756 ] 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.756 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.015 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.015 "name": "Existed_Raid", 00:17:53.015 "uuid": "e62f6e49-ec79-46d4-89a9-518345c32959", 00:17:53.015 "strip_size_kb": 0, 00:17:53.015 "state": "online", 00:17:53.015 "raid_level": "raid1", 00:17:53.015 "superblock": true, 00:17:53.015 "num_base_bdevs": 3, 00:17:53.015 "num_base_bdevs_discovered": 3, 00:17:53.015 "num_base_bdevs_operational": 3, 00:17:53.015 "base_bdevs_list": [ 00:17:53.015 { 00:17:53.015 "name": "BaseBdev1", 00:17:53.015 "uuid": "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089", 00:17:53.015 "is_configured": true, 00:17:53.015 "data_offset": 2048, 00:17:53.015 "data_size": 63488 00:17:53.015 }, 00:17:53.015 { 00:17:53.015 "name": "BaseBdev2", 00:17:53.015 "uuid": "cf818587-f6eb-421f-a41e-6584dde14a97", 00:17:53.015 "is_configured": true, 00:17:53.015 "data_offset": 2048, 00:17:53.015 "data_size": 63488 00:17:53.015 }, 00:17:53.016 { 00:17:53.016 "name": "BaseBdev3", 00:17:53.016 "uuid": "23ec435c-9b95-42c8-bf63-3511dfa91792", 00:17:53.016 "is_configured": true, 00:17:53.016 "data_offset": 2048, 00:17:53.016 "data_size": 63488 00:17:53.016 } 00:17:53.016 ] 00:17:53.016 }' 00:17:53.016 07:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.016 07:23:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:53.584 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:53.843 [2024-07-25 07:23:26.261965] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:53.843 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:53.843 "name": "Existed_Raid", 00:17:53.843 "aliases": [ 00:17:53.843 "e62f6e49-ec79-46d4-89a9-518345c32959" 00:17:53.843 ], 00:17:53.843 "product_name": "Raid Volume", 00:17:53.843 "block_size": 512, 00:17:53.843 "num_blocks": 63488, 00:17:53.843 "uuid": "e62f6e49-ec79-46d4-89a9-518345c32959", 00:17:53.843 "assigned_rate_limits": { 00:17:53.843 "rw_ios_per_sec": 0, 00:17:53.843 "rw_mbytes_per_sec": 0, 00:17:53.843 "r_mbytes_per_sec": 0, 00:17:53.843 "w_mbytes_per_sec": 0 00:17:53.843 }, 00:17:53.843 "claimed": false, 00:17:53.843 "zoned": false, 00:17:53.843 "supported_io_types": { 00:17:53.843 "read": true, 00:17:53.843 "write": true, 
00:17:53.843 "unmap": false, 00:17:53.843 "flush": false, 00:17:53.843 "reset": true, 00:17:53.843 "nvme_admin": false, 00:17:53.843 "nvme_io": false, 00:17:53.843 "nvme_io_md": false, 00:17:53.843 "write_zeroes": true, 00:17:53.843 "zcopy": false, 00:17:53.843 "get_zone_info": false, 00:17:53.843 "zone_management": false, 00:17:53.843 "zone_append": false, 00:17:53.843 "compare": false, 00:17:53.843 "compare_and_write": false, 00:17:53.843 "abort": false, 00:17:53.843 "seek_hole": false, 00:17:53.843 "seek_data": false, 00:17:53.843 "copy": false, 00:17:53.843 "nvme_iov_md": false 00:17:53.843 }, 00:17:53.843 "memory_domains": [ 00:17:53.843 { 00:17:53.843 "dma_device_id": "system", 00:17:53.843 "dma_device_type": 1 00:17:53.843 }, 00:17:53.843 { 00:17:53.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.843 "dma_device_type": 2 00:17:53.843 }, 00:17:53.843 { 00:17:53.843 "dma_device_id": "system", 00:17:53.843 "dma_device_type": 1 00:17:53.843 }, 00:17:53.843 { 00:17:53.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.843 "dma_device_type": 2 00:17:53.843 }, 00:17:53.843 { 00:17:53.843 "dma_device_id": "system", 00:17:53.843 "dma_device_type": 1 00:17:53.843 }, 00:17:53.843 { 00:17:53.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.843 "dma_device_type": 2 00:17:53.843 } 00:17:53.843 ], 00:17:53.843 "driver_specific": { 00:17:53.843 "raid": { 00:17:53.843 "uuid": "e62f6e49-ec79-46d4-89a9-518345c32959", 00:17:53.843 "strip_size_kb": 0, 00:17:53.843 "state": "online", 00:17:53.844 "raid_level": "raid1", 00:17:53.844 "superblock": true, 00:17:53.844 "num_base_bdevs": 3, 00:17:53.844 "num_base_bdevs_discovered": 3, 00:17:53.844 "num_base_bdevs_operational": 3, 00:17:53.844 "base_bdevs_list": [ 00:17:53.844 { 00:17:53.844 "name": "BaseBdev1", 00:17:53.844 "uuid": "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089", 00:17:53.844 "is_configured": true, 00:17:53.844 "data_offset": 2048, 00:17:53.844 "data_size": 63488 00:17:53.844 }, 00:17:53.844 { 00:17:53.844 "name": "BaseBdev2", 00:17:53.844 "uuid": "cf818587-f6eb-421f-a41e-6584dde14a97", 00:17:53.844 "is_configured": true, 00:17:53.844 "data_offset": 2048, 00:17:53.844 "data_size": 63488 00:17:53.844 }, 00:17:53.844 { 00:17:53.844 "name": "BaseBdev3", 00:17:53.844 "uuid": "23ec435c-9b95-42c8-bf63-3511dfa91792", 00:17:53.844 "is_configured": true, 00:17:53.844 "data_offset": 2048, 00:17:53.844 "data_size": 63488 00:17:53.844 } 00:17:53.844 ] 00:17:53.844 } 00:17:53.844 } 00:17:53.844 }' 00:17:53.844 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:53.844 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:53.844 BaseBdev2 00:17:53.844 BaseBdev3' 00:17:53.844 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.844 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:53.844 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:54.103 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:54.103 "name": "BaseBdev1", 00:17:54.103 "aliases": [ 00:17:54.103 "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089" 00:17:54.103 ], 00:17:54.103 "product_name": "Malloc disk", 00:17:54.103 
"block_size": 512, 00:17:54.103 "num_blocks": 65536, 00:17:54.103 "uuid": "b4a7a74f-9fb0-4af4-b3a0-9d27df2b5089", 00:17:54.103 "assigned_rate_limits": { 00:17:54.103 "rw_ios_per_sec": 0, 00:17:54.103 "rw_mbytes_per_sec": 0, 00:17:54.103 "r_mbytes_per_sec": 0, 00:17:54.103 "w_mbytes_per_sec": 0 00:17:54.103 }, 00:17:54.103 "claimed": true, 00:17:54.103 "claim_type": "exclusive_write", 00:17:54.103 "zoned": false, 00:17:54.103 "supported_io_types": { 00:17:54.103 "read": true, 00:17:54.103 "write": true, 00:17:54.103 "unmap": true, 00:17:54.103 "flush": true, 00:17:54.103 "reset": true, 00:17:54.103 "nvme_admin": false, 00:17:54.103 "nvme_io": false, 00:17:54.103 "nvme_io_md": false, 00:17:54.103 "write_zeroes": true, 00:17:54.103 "zcopy": true, 00:17:54.103 "get_zone_info": false, 00:17:54.103 "zone_management": false, 00:17:54.104 "zone_append": false, 00:17:54.104 "compare": false, 00:17:54.104 "compare_and_write": false, 00:17:54.104 "abort": true, 00:17:54.104 "seek_hole": false, 00:17:54.104 "seek_data": false, 00:17:54.104 "copy": true, 00:17:54.104 "nvme_iov_md": false 00:17:54.104 }, 00:17:54.104 "memory_domains": [ 00:17:54.104 { 00:17:54.104 "dma_device_id": "system", 00:17:54.104 "dma_device_type": 1 00:17:54.104 }, 00:17:54.104 { 00:17:54.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.104 "dma_device_type": 2 00:17:54.104 } 00:17:54.104 ], 00:17:54.104 "driver_specific": {} 00:17:54.104 }' 00:17:54.104 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.104 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.363 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.622 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:54.622 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:54.622 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:54.622 07:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:54.622 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:54.622 "name": "BaseBdev2", 00:17:54.622 "aliases": [ 00:17:54.622 "cf818587-f6eb-421f-a41e-6584dde14a97" 00:17:54.622 ], 00:17:54.622 "product_name": "Malloc disk", 00:17:54.622 "block_size": 512, 00:17:54.622 "num_blocks": 65536, 00:17:54.622 "uuid": "cf818587-f6eb-421f-a41e-6584dde14a97", 00:17:54.622 
"assigned_rate_limits": { 00:17:54.622 "rw_ios_per_sec": 0, 00:17:54.622 "rw_mbytes_per_sec": 0, 00:17:54.622 "r_mbytes_per_sec": 0, 00:17:54.622 "w_mbytes_per_sec": 0 00:17:54.622 }, 00:17:54.622 "claimed": true, 00:17:54.622 "claim_type": "exclusive_write", 00:17:54.622 "zoned": false, 00:17:54.622 "supported_io_types": { 00:17:54.622 "read": true, 00:17:54.622 "write": true, 00:17:54.622 "unmap": true, 00:17:54.622 "flush": true, 00:17:54.622 "reset": true, 00:17:54.622 "nvme_admin": false, 00:17:54.622 "nvme_io": false, 00:17:54.622 "nvme_io_md": false, 00:17:54.622 "write_zeroes": true, 00:17:54.622 "zcopy": true, 00:17:54.622 "get_zone_info": false, 00:17:54.622 "zone_management": false, 00:17:54.622 "zone_append": false, 00:17:54.622 "compare": false, 00:17:54.622 "compare_and_write": false, 00:17:54.622 "abort": true, 00:17:54.622 "seek_hole": false, 00:17:54.622 "seek_data": false, 00:17:54.622 "copy": true, 00:17:54.622 "nvme_iov_md": false 00:17:54.622 }, 00:17:54.622 "memory_domains": [ 00:17:54.622 { 00:17:54.622 "dma_device_id": "system", 00:17:54.622 "dma_device_type": 1 00:17:54.622 }, 00:17:54.622 { 00:17:54.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.622 "dma_device_type": 2 00:17:54.622 } 00:17:54.622 ], 00:17:54.622 "driver_specific": {} 00:17:54.622 }' 00:17:54.622 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.881 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:55.140 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:55.140 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:55.140 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:55.140 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:55.140 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:55.399 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:55.399 "name": "BaseBdev3", 00:17:55.399 "aliases": [ 00:17:55.399 "23ec435c-9b95-42c8-bf63-3511dfa91792" 00:17:55.399 ], 00:17:55.399 "product_name": "Malloc disk", 00:17:55.399 "block_size": 512, 00:17:55.399 "num_blocks": 65536, 00:17:55.399 "uuid": "23ec435c-9b95-42c8-bf63-3511dfa91792", 00:17:55.399 "assigned_rate_limits": { 00:17:55.399 "rw_ios_per_sec": 0, 00:17:55.399 "rw_mbytes_per_sec": 0, 00:17:55.399 "r_mbytes_per_sec": 0, 
00:17:55.399 "w_mbytes_per_sec": 0 00:17:55.399 }, 00:17:55.399 "claimed": true, 00:17:55.399 "claim_type": "exclusive_write", 00:17:55.399 "zoned": false, 00:17:55.399 "supported_io_types": { 00:17:55.399 "read": true, 00:17:55.399 "write": true, 00:17:55.399 "unmap": true, 00:17:55.399 "flush": true, 00:17:55.399 "reset": true, 00:17:55.399 "nvme_admin": false, 00:17:55.399 "nvme_io": false, 00:17:55.399 "nvme_io_md": false, 00:17:55.399 "write_zeroes": true, 00:17:55.399 "zcopy": true, 00:17:55.399 "get_zone_info": false, 00:17:55.399 "zone_management": false, 00:17:55.399 "zone_append": false, 00:17:55.399 "compare": false, 00:17:55.399 "compare_and_write": false, 00:17:55.399 "abort": true, 00:17:55.399 "seek_hole": false, 00:17:55.399 "seek_data": false, 00:17:55.399 "copy": true, 00:17:55.399 "nvme_iov_md": false 00:17:55.400 }, 00:17:55.400 "memory_domains": [ 00:17:55.400 { 00:17:55.400 "dma_device_id": "system", 00:17:55.400 "dma_device_type": 1 00:17:55.400 }, 00:17:55.400 { 00:17:55.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:55.400 "dma_device_type": 2 00:17:55.400 } 00:17:55.400 ], 00:17:55.400 "driver_specific": {} 00:17:55.400 }' 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:55.400 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:55.659 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:55.659 07:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:55.659 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:55.659 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:55.659 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:55.917 [2024-07-25 07:23:28.271005] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.917 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.177 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.177 "name": "Existed_Raid", 00:17:56.177 "uuid": "e62f6e49-ec79-46d4-89a9-518345c32959", 00:17:56.177 "strip_size_kb": 0, 00:17:56.177 "state": "online", 00:17:56.177 "raid_level": "raid1", 00:17:56.177 "superblock": true, 00:17:56.177 "num_base_bdevs": 3, 00:17:56.177 "num_base_bdevs_discovered": 2, 00:17:56.177 "num_base_bdevs_operational": 2, 00:17:56.177 "base_bdevs_list": [ 00:17:56.177 { 00:17:56.177 "name": null, 00:17:56.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.177 "is_configured": false, 00:17:56.177 "data_offset": 2048, 00:17:56.177 "data_size": 63488 00:17:56.177 }, 00:17:56.177 { 00:17:56.177 "name": "BaseBdev2", 00:17:56.177 "uuid": "cf818587-f6eb-421f-a41e-6584dde14a97", 00:17:56.177 "is_configured": true, 00:17:56.177 "data_offset": 2048, 00:17:56.177 "data_size": 63488 00:17:56.177 }, 00:17:56.177 { 00:17:56.177 "name": "BaseBdev3", 00:17:56.177 "uuid": "23ec435c-9b95-42c8-bf63-3511dfa91792", 00:17:56.177 "is_configured": true, 00:17:56.177 "data_offset": 2048, 00:17:56.177 "data_size": 63488 00:17:56.177 } 00:17:56.177 ] 00:17:56.177 }' 00:17:56.177 07:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.177 07:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.746 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:56.746 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:56.746 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.746 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:57.006 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:57.006 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:57.006 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:57.006 [2024-07-25 07:23:29.523338] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:57.320 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:57.320 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:57.320 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.320 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:57.320 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:57.320 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:57.320 07:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:57.592 [2024-07-25 07:23:29.974805] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:57.592 [2024-07-25 07:23:29.974886] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:57.592 [2024-07-25 07:23:29.985066] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:57.592 [2024-07-25 07:23:29.985095] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:57.592 [2024-07-25 07:23:29.985106] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x191b680 name Existed_Raid, state offline 00:17:57.592 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:57.592 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:57.592 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.592 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:57.852 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:57.852 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:57.852 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:57.852 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:57.852 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:57.852 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:58.112 BaseBdev2 00:17:58.112 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:58.112 07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:58.112 07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:58.112 
07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:58.112 07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:58.112 07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:58.112 07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:58.371 07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:58.371 [ 00:17:58.371 { 00:17:58.371 "name": "BaseBdev2", 00:17:58.371 "aliases": [ 00:17:58.371 "0a7849d0-3966-45e7-82e4-4e7dcf26742d" 00:17:58.371 ], 00:17:58.371 "product_name": "Malloc disk", 00:17:58.371 "block_size": 512, 00:17:58.371 "num_blocks": 65536, 00:17:58.371 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:17:58.371 "assigned_rate_limits": { 00:17:58.371 "rw_ios_per_sec": 0, 00:17:58.371 "rw_mbytes_per_sec": 0, 00:17:58.371 "r_mbytes_per_sec": 0, 00:17:58.371 "w_mbytes_per_sec": 0 00:17:58.371 }, 00:17:58.371 "claimed": false, 00:17:58.371 "zoned": false, 00:17:58.371 "supported_io_types": { 00:17:58.371 "read": true, 00:17:58.371 "write": true, 00:17:58.372 "unmap": true, 00:17:58.372 "flush": true, 00:17:58.372 "reset": true, 00:17:58.372 "nvme_admin": false, 00:17:58.372 "nvme_io": false, 00:17:58.372 "nvme_io_md": false, 00:17:58.372 "write_zeroes": true, 00:17:58.372 "zcopy": true, 00:17:58.372 "get_zone_info": false, 00:17:58.372 "zone_management": false, 00:17:58.372 "zone_append": false, 00:17:58.372 "compare": false, 00:17:58.372 "compare_and_write": false, 00:17:58.372 "abort": true, 00:17:58.372 "seek_hole": false, 00:17:58.372 "seek_data": false, 00:17:58.372 "copy": true, 00:17:58.372 "nvme_iov_md": false 00:17:58.372 }, 00:17:58.372 "memory_domains": [ 00:17:58.372 { 00:17:58.372 "dma_device_id": "system", 00:17:58.372 "dma_device_type": 1 00:17:58.372 }, 00:17:58.372 { 00:17:58.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.372 "dma_device_type": 2 00:17:58.372 } 00:17:58.372 ], 00:17:58.372 "driver_specific": {} 00:17:58.372 } 00:17:58.372 ] 00:17:58.631 07:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:58.631 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:58.631 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:58.631 07:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:58.631 BaseBdev3 00:17:58.631 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:58.631 07:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:58.631 07:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:58.631 07:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:58.631 07:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:58.631 07:23:31 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:58.631 07:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:58.890 07:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:59.150 [ 00:17:59.150 { 00:17:59.150 "name": "BaseBdev3", 00:17:59.150 "aliases": [ 00:17:59.150 "682e5b77-e952-4b9f-90e6-3bac4ccef922" 00:17:59.150 ], 00:17:59.150 "product_name": "Malloc disk", 00:17:59.150 "block_size": 512, 00:17:59.150 "num_blocks": 65536, 00:17:59.150 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:17:59.150 "assigned_rate_limits": { 00:17:59.150 "rw_ios_per_sec": 0, 00:17:59.150 "rw_mbytes_per_sec": 0, 00:17:59.150 "r_mbytes_per_sec": 0, 00:17:59.150 "w_mbytes_per_sec": 0 00:17:59.150 }, 00:17:59.150 "claimed": false, 00:17:59.150 "zoned": false, 00:17:59.150 "supported_io_types": { 00:17:59.150 "read": true, 00:17:59.150 "write": true, 00:17:59.150 "unmap": true, 00:17:59.150 "flush": true, 00:17:59.150 "reset": true, 00:17:59.150 "nvme_admin": false, 00:17:59.150 "nvme_io": false, 00:17:59.150 "nvme_io_md": false, 00:17:59.150 "write_zeroes": true, 00:17:59.150 "zcopy": true, 00:17:59.150 "get_zone_info": false, 00:17:59.150 "zone_management": false, 00:17:59.150 "zone_append": false, 00:17:59.150 "compare": false, 00:17:59.150 "compare_and_write": false, 00:17:59.150 "abort": true, 00:17:59.150 "seek_hole": false, 00:17:59.150 "seek_data": false, 00:17:59.150 "copy": true, 00:17:59.150 "nvme_iov_md": false 00:17:59.150 }, 00:17:59.150 "memory_domains": [ 00:17:59.150 { 00:17:59.150 "dma_device_id": "system", 00:17:59.150 "dma_device_type": 1 00:17:59.150 }, 00:17:59.150 { 00:17:59.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.150 "dma_device_type": 2 00:17:59.150 } 00:17:59.150 ], 00:17:59.150 "driver_specific": {} 00:17:59.150 } 00:17:59.150 ] 00:17:59.150 07:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:59.150 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:59.150 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:59.150 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:59.409 [2024-07-25 07:23:31.834185] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:59.409 [2024-07-25 07:23:31.834224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:59.409 [2024-07-25 07:23:31.834242] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:59.409 [2024-07-25 07:23:31.835457] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.409 07:23:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.409 07:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.669 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.669 "name": "Existed_Raid", 00:17:59.669 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:17:59.669 "strip_size_kb": 0, 00:17:59.669 "state": "configuring", 00:17:59.669 "raid_level": "raid1", 00:17:59.669 "superblock": true, 00:17:59.669 "num_base_bdevs": 3, 00:17:59.669 "num_base_bdevs_discovered": 2, 00:17:59.669 "num_base_bdevs_operational": 3, 00:17:59.669 "base_bdevs_list": [ 00:17:59.669 { 00:17:59.669 "name": "BaseBdev1", 00:17:59.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.669 "is_configured": false, 00:17:59.669 "data_offset": 0, 00:17:59.669 "data_size": 0 00:17:59.669 }, 00:17:59.669 { 00:17:59.669 "name": "BaseBdev2", 00:17:59.669 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:17:59.669 "is_configured": true, 00:17:59.669 "data_offset": 2048, 00:17:59.669 "data_size": 63488 00:17:59.669 }, 00:17:59.669 { 00:17:59.669 "name": "BaseBdev3", 00:17:59.669 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:17:59.669 "is_configured": true, 00:17:59.669 "data_offset": 2048, 00:17:59.669 "data_size": 63488 00:17:59.669 } 00:17:59.669 ] 00:17:59.669 }' 00:17:59.669 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.669 07:23:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:00.238 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:00.497 [2024-07-25 07:23:32.844817] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.497 07:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.756 07:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.756 "name": "Existed_Raid", 00:18:00.756 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:00.756 "strip_size_kb": 0, 00:18:00.756 "state": "configuring", 00:18:00.756 "raid_level": "raid1", 00:18:00.756 "superblock": true, 00:18:00.756 "num_base_bdevs": 3, 00:18:00.756 "num_base_bdevs_discovered": 1, 00:18:00.756 "num_base_bdevs_operational": 3, 00:18:00.756 "base_bdevs_list": [ 00:18:00.756 { 00:18:00.756 "name": "BaseBdev1", 00:18:00.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.756 "is_configured": false, 00:18:00.756 "data_offset": 0, 00:18:00.756 "data_size": 0 00:18:00.756 }, 00:18:00.756 { 00:18:00.756 "name": null, 00:18:00.756 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:00.756 "is_configured": false, 00:18:00.756 "data_offset": 2048, 00:18:00.756 "data_size": 63488 00:18:00.756 }, 00:18:00.756 { 00:18:00.756 "name": "BaseBdev3", 00:18:00.756 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:00.756 "is_configured": true, 00:18:00.756 "data_offset": 2048, 00:18:00.756 "data_size": 63488 00:18:00.756 } 00:18:00.756 ] 00:18:00.756 }' 00:18:00.756 07:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.756 07:23:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:01.323 07:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.323 07:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:01.583 07:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:01.583 07:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:01.842 [2024-07-25 07:23:34.123274] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:01.842 BaseBdev1 00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 
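
With BaseBdev2 knocked out by bdev_raid_remove_base_bdev, the test backfills the slot that never existed by creating BaseBdev1 itself; the DEBUG line above shows examine claiming the new malloc bdev for Existed_Raid right away. The waitforbdev helper invoked next boils down to two RPCs (a sketch under the same socket assumptions as above):

    # create the missing base bdev; the raid module claims it during examine
    spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1

    # waitforbdev: flush examine callbacks, then wait up to 2000 ms for the bdev to be
    # reported by bdev_get_bdevs before the test goes on to verify the raid state
    spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
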
00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:01.842 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:02.102 [ 00:18:02.102 { 00:18:02.102 "name": "BaseBdev1", 00:18:02.102 "aliases": [ 00:18:02.102 "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2" 00:18:02.102 ], 00:18:02.102 "product_name": "Malloc disk", 00:18:02.102 "block_size": 512, 00:18:02.102 "num_blocks": 65536, 00:18:02.102 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:02.102 "assigned_rate_limits": { 00:18:02.102 "rw_ios_per_sec": 0, 00:18:02.102 "rw_mbytes_per_sec": 0, 00:18:02.102 "r_mbytes_per_sec": 0, 00:18:02.102 "w_mbytes_per_sec": 0 00:18:02.102 }, 00:18:02.102 "claimed": true, 00:18:02.102 "claim_type": "exclusive_write", 00:18:02.102 "zoned": false, 00:18:02.102 "supported_io_types": { 00:18:02.102 "read": true, 00:18:02.102 "write": true, 00:18:02.102 "unmap": true, 00:18:02.102 "flush": true, 00:18:02.102 "reset": true, 00:18:02.102 "nvme_admin": false, 00:18:02.102 "nvme_io": false, 00:18:02.102 "nvme_io_md": false, 00:18:02.102 "write_zeroes": true, 00:18:02.102 "zcopy": true, 00:18:02.102 "get_zone_info": false, 00:18:02.102 "zone_management": false, 00:18:02.102 "zone_append": false, 00:18:02.102 "compare": false, 00:18:02.102 "compare_and_write": false, 00:18:02.102 "abort": true, 00:18:02.102 "seek_hole": false, 00:18:02.102 "seek_data": false, 00:18:02.102 "copy": true, 00:18:02.102 "nvme_iov_md": false 00:18:02.102 }, 00:18:02.102 "memory_domains": [ 00:18:02.102 { 00:18:02.102 "dma_device_id": "system", 00:18:02.102 "dma_device_type": 1 00:18:02.102 }, 00:18:02.102 { 00:18:02.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.102 "dma_device_type": 2 00:18:02.102 } 00:18:02.102 ], 00:18:02.102 "driver_specific": {} 00:18:02.102 } 00:18:02.102 ] 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:02.102 07:23:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.102 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:02.362 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:02.362 "name": "Existed_Raid", 00:18:02.362 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:02.362 "strip_size_kb": 0, 00:18:02.362 "state": "configuring", 00:18:02.362 "raid_level": "raid1", 00:18:02.362 "superblock": true, 00:18:02.362 "num_base_bdevs": 3, 00:18:02.362 "num_base_bdevs_discovered": 2, 00:18:02.362 "num_base_bdevs_operational": 3, 00:18:02.362 "base_bdevs_list": [ 00:18:02.362 { 00:18:02.362 "name": "BaseBdev1", 00:18:02.362 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:02.362 "is_configured": true, 00:18:02.362 "data_offset": 2048, 00:18:02.362 "data_size": 63488 00:18:02.362 }, 00:18:02.362 { 00:18:02.362 "name": null, 00:18:02.362 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:02.362 "is_configured": false, 00:18:02.362 "data_offset": 2048, 00:18:02.362 "data_size": 63488 00:18:02.362 }, 00:18:02.362 { 00:18:02.362 "name": "BaseBdev3", 00:18:02.362 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:02.362 "is_configured": true, 00:18:02.362 "data_offset": 2048, 00:18:02.362 "data_size": 63488 00:18:02.362 } 00:18:02.362 ] 00:18:02.362 }' 00:18:02.362 07:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:02.362 07:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:02.932 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.932 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:03.193 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:03.193 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:03.453 [2024-07-25 07:23:35.811748] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.453 07:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.712 07:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.712 "name": "Existed_Raid", 00:18:03.712 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:03.712 "strip_size_kb": 0, 00:18:03.712 "state": "configuring", 00:18:03.712 "raid_level": "raid1", 00:18:03.712 "superblock": true, 00:18:03.712 "num_base_bdevs": 3, 00:18:03.712 "num_base_bdevs_discovered": 1, 00:18:03.712 "num_base_bdevs_operational": 3, 00:18:03.712 "base_bdevs_list": [ 00:18:03.712 { 00:18:03.712 "name": "BaseBdev1", 00:18:03.712 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:03.712 "is_configured": true, 00:18:03.712 "data_offset": 2048, 00:18:03.712 "data_size": 63488 00:18:03.712 }, 00:18:03.712 { 00:18:03.712 "name": null, 00:18:03.712 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:03.712 "is_configured": false, 00:18:03.712 "data_offset": 2048, 00:18:03.712 "data_size": 63488 00:18:03.712 }, 00:18:03.712 { 00:18:03.712 "name": null, 00:18:03.712 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:03.712 "is_configured": false, 00:18:03.712 "data_offset": 2048, 00:18:03.712 "data_size": 63488 00:18:03.712 } 00:18:03.712 ] 00:18:03.712 }' 00:18:03.712 07:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.712 07:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.280 07:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.280 07:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:04.539 07:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:04.539 07:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:04.799 [2024-07-25 07:23:37.079104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.799 "name": "Existed_Raid", 00:18:04.799 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:04.799 "strip_size_kb": 0, 00:18:04.799 "state": "configuring", 00:18:04.799 "raid_level": "raid1", 00:18:04.799 "superblock": true, 00:18:04.799 "num_base_bdevs": 3, 00:18:04.799 "num_base_bdevs_discovered": 2, 00:18:04.799 "num_base_bdevs_operational": 3, 00:18:04.799 "base_bdevs_list": [ 00:18:04.799 { 00:18:04.799 "name": "BaseBdev1", 00:18:04.799 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:04.799 "is_configured": true, 00:18:04.799 "data_offset": 2048, 00:18:04.799 "data_size": 63488 00:18:04.799 }, 00:18:04.799 { 00:18:04.799 "name": null, 00:18:04.799 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:04.799 "is_configured": false, 00:18:04.799 "data_offset": 2048, 00:18:04.799 "data_size": 63488 00:18:04.799 }, 00:18:04.799 { 00:18:04.799 "name": "BaseBdev3", 00:18:04.799 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:04.799 "is_configured": true, 00:18:04.799 "data_offset": 2048, 00:18:04.799 "data_size": 63488 00:18:04.799 } 00:18:04.799 ] 00:18:04.799 }' 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.799 07:23:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:05.366 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.366 07:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:05.624 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:05.624 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:05.883 [2024-07-25 07:23:38.318393] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.883 07:23:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.883 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.143 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.143 "name": "Existed_Raid", 00:18:06.143 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:06.143 "strip_size_kb": 0, 00:18:06.143 "state": "configuring", 00:18:06.143 "raid_level": "raid1", 00:18:06.143 "superblock": true, 00:18:06.143 "num_base_bdevs": 3, 00:18:06.143 "num_base_bdevs_discovered": 1, 00:18:06.143 "num_base_bdevs_operational": 3, 00:18:06.143 "base_bdevs_list": [ 00:18:06.143 { 00:18:06.143 "name": null, 00:18:06.143 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:06.143 "is_configured": false, 00:18:06.143 "data_offset": 2048, 00:18:06.143 "data_size": 63488 00:18:06.143 }, 00:18:06.143 { 00:18:06.143 "name": null, 00:18:06.143 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:06.143 "is_configured": false, 00:18:06.143 "data_offset": 2048, 00:18:06.143 "data_size": 63488 00:18:06.143 }, 00:18:06.143 { 00:18:06.143 "name": "BaseBdev3", 00:18:06.143 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:06.143 "is_configured": true, 00:18:06.143 "data_offset": 2048, 00:18:06.143 "data_size": 63488 00:18:06.143 } 00:18:06.143 ] 00:18:06.143 }' 00:18:06.143 07:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.143 07:23:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.710 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:06.710 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.969 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:06.969 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:07.228 [2024-07-25 07:23:39.547818] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.228 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:07.487 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.487 "name": "Existed_Raid", 00:18:07.487 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:07.487 "strip_size_kb": 0, 00:18:07.487 "state": "configuring", 00:18:07.487 "raid_level": "raid1", 00:18:07.487 "superblock": true, 00:18:07.487 "num_base_bdevs": 3, 00:18:07.487 "num_base_bdevs_discovered": 2, 00:18:07.487 "num_base_bdevs_operational": 3, 00:18:07.487 "base_bdevs_list": [ 00:18:07.487 { 00:18:07.487 "name": null, 00:18:07.487 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:07.487 "is_configured": false, 00:18:07.487 "data_offset": 2048, 00:18:07.487 "data_size": 63488 00:18:07.487 }, 00:18:07.487 { 00:18:07.487 "name": "BaseBdev2", 00:18:07.487 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:07.487 "is_configured": true, 00:18:07.487 "data_offset": 2048, 00:18:07.487 "data_size": 63488 00:18:07.487 }, 00:18:07.487 { 00:18:07.487 "name": "BaseBdev3", 00:18:07.487 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:07.487 "is_configured": true, 00:18:07.487 "data_offset": 2048, 00:18:07.487 "data_size": 63488 00:18:07.487 } 00:18:07.487 ] 00:18:07.487 }' 00:18:07.487 07:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.487 07:23:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.054 07:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.054 07:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:08.313 07:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:08.313 07:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.313 07:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:08.313 07:23:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2 00:18:08.572 [2024-07-25 07:23:41.062990] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:08.572 [2024-07-25 07:23:41.063131] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1abf950 00:18:08.572 [2024-07-25 07:23:41.063152] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:08.572 [2024-07-25 07:23:41.063318] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac8680 00:18:08.572 [2024-07-25 07:23:41.063427] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1abf950 00:18:08.572 [2024-07-25 07:23:41.063437] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1abf950 00:18:08.572 [2024-07-25 07:23:41.063521] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:08.572 NewBaseBdev 00:18:08.572 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:08.572 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:18:08.572 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:08.572 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:08.572 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:08.572 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:08.572 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:08.831 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:09.090 [ 00:18:09.090 { 00:18:09.090 "name": "NewBaseBdev", 00:18:09.090 "aliases": [ 00:18:09.090 "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2" 00:18:09.090 ], 00:18:09.090 "product_name": "Malloc disk", 00:18:09.090 "block_size": 512, 00:18:09.090 "num_blocks": 65536, 00:18:09.090 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:09.090 "assigned_rate_limits": { 00:18:09.090 "rw_ios_per_sec": 0, 00:18:09.090 "rw_mbytes_per_sec": 0, 00:18:09.090 "r_mbytes_per_sec": 0, 00:18:09.090 "w_mbytes_per_sec": 0 00:18:09.090 }, 00:18:09.090 "claimed": true, 00:18:09.090 "claim_type": "exclusive_write", 00:18:09.090 "zoned": false, 00:18:09.090 "supported_io_types": { 00:18:09.090 "read": true, 00:18:09.090 "write": true, 00:18:09.090 "unmap": true, 00:18:09.090 "flush": true, 00:18:09.090 "reset": true, 00:18:09.090 "nvme_admin": false, 00:18:09.090 "nvme_io": false, 00:18:09.090 "nvme_io_md": false, 00:18:09.090 "write_zeroes": true, 00:18:09.090 "zcopy": true, 00:18:09.090 "get_zone_info": false, 00:18:09.090 "zone_management": false, 00:18:09.090 "zone_append": false, 00:18:09.090 "compare": false, 00:18:09.090 "compare_and_write": false, 00:18:09.090 "abort": true, 00:18:09.090 "seek_hole": false, 00:18:09.090 "seek_data": false, 00:18:09.090 "copy": true, 
00:18:09.090 "nvme_iov_md": false 00:18:09.090 }, 00:18:09.090 "memory_domains": [ 00:18:09.090 { 00:18:09.090 "dma_device_id": "system", 00:18:09.090 "dma_device_type": 1 00:18:09.090 }, 00:18:09.090 { 00:18:09.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.090 "dma_device_type": 2 00:18:09.090 } 00:18:09.090 ], 00:18:09.090 "driver_specific": {} 00:18:09.090 } 00:18:09.090 ] 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:09.090 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.350 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.350 "name": "Existed_Raid", 00:18:09.350 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:09.350 "strip_size_kb": 0, 00:18:09.350 "state": "online", 00:18:09.350 "raid_level": "raid1", 00:18:09.350 "superblock": true, 00:18:09.350 "num_base_bdevs": 3, 00:18:09.350 "num_base_bdevs_discovered": 3, 00:18:09.350 "num_base_bdevs_operational": 3, 00:18:09.350 "base_bdevs_list": [ 00:18:09.350 { 00:18:09.350 "name": "NewBaseBdev", 00:18:09.350 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:09.350 "is_configured": true, 00:18:09.350 "data_offset": 2048, 00:18:09.350 "data_size": 63488 00:18:09.350 }, 00:18:09.350 { 00:18:09.350 "name": "BaseBdev2", 00:18:09.350 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:09.350 "is_configured": true, 00:18:09.350 "data_offset": 2048, 00:18:09.350 "data_size": 63488 00:18:09.350 }, 00:18:09.350 { 00:18:09.350 "name": "BaseBdev3", 00:18:09.350 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:09.350 "is_configured": true, 00:18:09.350 "data_offset": 2048, 00:18:09.350 "data_size": 63488 00:18:09.350 } 00:18:09.350 ] 00:18:09.350 }' 00:18:09.350 07:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.350 07:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties 
Existed_Raid 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:09.917 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:10.177 [2024-07-25 07:23:42.519120] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:10.177 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:10.177 "name": "Existed_Raid", 00:18:10.177 "aliases": [ 00:18:10.177 "728f2d7c-fb30-4d6a-9403-e88c6fa7441c" 00:18:10.177 ], 00:18:10.177 "product_name": "Raid Volume", 00:18:10.177 "block_size": 512, 00:18:10.177 "num_blocks": 63488, 00:18:10.177 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:10.177 "assigned_rate_limits": { 00:18:10.177 "rw_ios_per_sec": 0, 00:18:10.177 "rw_mbytes_per_sec": 0, 00:18:10.177 "r_mbytes_per_sec": 0, 00:18:10.177 "w_mbytes_per_sec": 0 00:18:10.177 }, 00:18:10.177 "claimed": false, 00:18:10.177 "zoned": false, 00:18:10.177 "supported_io_types": { 00:18:10.177 "read": true, 00:18:10.177 "write": true, 00:18:10.177 "unmap": false, 00:18:10.177 "flush": false, 00:18:10.177 "reset": true, 00:18:10.177 "nvme_admin": false, 00:18:10.177 "nvme_io": false, 00:18:10.177 "nvme_io_md": false, 00:18:10.177 "write_zeroes": true, 00:18:10.177 "zcopy": false, 00:18:10.177 "get_zone_info": false, 00:18:10.177 "zone_management": false, 00:18:10.177 "zone_append": false, 00:18:10.177 "compare": false, 00:18:10.177 "compare_and_write": false, 00:18:10.177 "abort": false, 00:18:10.177 "seek_hole": false, 00:18:10.177 "seek_data": false, 00:18:10.177 "copy": false, 00:18:10.177 "nvme_iov_md": false 00:18:10.177 }, 00:18:10.177 "memory_domains": [ 00:18:10.177 { 00:18:10.177 "dma_device_id": "system", 00:18:10.177 "dma_device_type": 1 00:18:10.177 }, 00:18:10.177 { 00:18:10.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.177 "dma_device_type": 2 00:18:10.177 }, 00:18:10.177 { 00:18:10.177 "dma_device_id": "system", 00:18:10.177 "dma_device_type": 1 00:18:10.177 }, 00:18:10.177 { 00:18:10.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.177 "dma_device_type": 2 00:18:10.177 }, 00:18:10.177 { 00:18:10.177 "dma_device_id": "system", 00:18:10.177 "dma_device_type": 1 00:18:10.177 }, 00:18:10.177 { 00:18:10.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.177 "dma_device_type": 2 00:18:10.177 } 00:18:10.177 ], 00:18:10.177 "driver_specific": { 00:18:10.177 "raid": { 00:18:10.177 "uuid": "728f2d7c-fb30-4d6a-9403-e88c6fa7441c", 00:18:10.177 "strip_size_kb": 0, 00:18:10.177 "state": "online", 00:18:10.177 "raid_level": "raid1", 00:18:10.177 "superblock": true, 00:18:10.177 "num_base_bdevs": 3, 00:18:10.177 "num_base_bdevs_discovered": 3, 00:18:10.177 "num_base_bdevs_operational": 3, 00:18:10.177 "base_bdevs_list": [ 00:18:10.177 { 00:18:10.177 "name": "NewBaseBdev", 
00:18:10.177 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:10.177 "is_configured": true, 00:18:10.177 "data_offset": 2048, 00:18:10.177 "data_size": 63488 00:18:10.177 }, 00:18:10.177 { 00:18:10.177 "name": "BaseBdev2", 00:18:10.177 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:10.177 "is_configured": true, 00:18:10.177 "data_offset": 2048, 00:18:10.177 "data_size": 63488 00:18:10.177 }, 00:18:10.177 { 00:18:10.177 "name": "BaseBdev3", 00:18:10.177 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:10.177 "is_configured": true, 00:18:10.177 "data_offset": 2048, 00:18:10.177 "data_size": 63488 00:18:10.177 } 00:18:10.177 ] 00:18:10.177 } 00:18:10.177 } 00:18:10.177 }' 00:18:10.177 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:10.177 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:10.177 BaseBdev2 00:18:10.177 BaseBdev3' 00:18:10.177 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.177 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:10.177 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:10.467 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:10.467 "name": "NewBaseBdev", 00:18:10.467 "aliases": [ 00:18:10.467 "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2" 00:18:10.467 ], 00:18:10.467 "product_name": "Malloc disk", 00:18:10.467 "block_size": 512, 00:18:10.467 "num_blocks": 65536, 00:18:10.467 "uuid": "8c99d1ca-16b4-44dc-aae4-0e4f9a6599c2", 00:18:10.467 "assigned_rate_limits": { 00:18:10.467 "rw_ios_per_sec": 0, 00:18:10.467 "rw_mbytes_per_sec": 0, 00:18:10.467 "r_mbytes_per_sec": 0, 00:18:10.467 "w_mbytes_per_sec": 0 00:18:10.467 }, 00:18:10.467 "claimed": true, 00:18:10.467 "claim_type": "exclusive_write", 00:18:10.467 "zoned": false, 00:18:10.467 "supported_io_types": { 00:18:10.467 "read": true, 00:18:10.467 "write": true, 00:18:10.467 "unmap": true, 00:18:10.467 "flush": true, 00:18:10.467 "reset": true, 00:18:10.467 "nvme_admin": false, 00:18:10.467 "nvme_io": false, 00:18:10.467 "nvme_io_md": false, 00:18:10.467 "write_zeroes": true, 00:18:10.467 "zcopy": true, 00:18:10.467 "get_zone_info": false, 00:18:10.467 "zone_management": false, 00:18:10.467 "zone_append": false, 00:18:10.467 "compare": false, 00:18:10.467 "compare_and_write": false, 00:18:10.467 "abort": true, 00:18:10.467 "seek_hole": false, 00:18:10.467 "seek_data": false, 00:18:10.467 "copy": true, 00:18:10.467 "nvme_iov_md": false 00:18:10.467 }, 00:18:10.467 "memory_domains": [ 00:18:10.467 { 00:18:10.467 "dma_device_id": "system", 00:18:10.467 "dma_device_type": 1 00:18:10.467 }, 00:18:10.467 { 00:18:10.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.467 "dma_device_type": 2 00:18:10.467 } 00:18:10.467 ], 00:18:10.467 "driver_specific": {} 00:18:10.467 }' 00:18:10.467 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.467 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.467 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:10.467 07:23:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.467 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.467 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:10.467 07:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:10.726 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:10.985 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:10.985 "name": "BaseBdev2", 00:18:10.985 "aliases": [ 00:18:10.985 "0a7849d0-3966-45e7-82e4-4e7dcf26742d" 00:18:10.985 ], 00:18:10.985 "product_name": "Malloc disk", 00:18:10.985 "block_size": 512, 00:18:10.985 "num_blocks": 65536, 00:18:10.985 "uuid": "0a7849d0-3966-45e7-82e4-4e7dcf26742d", 00:18:10.985 "assigned_rate_limits": { 00:18:10.985 "rw_ios_per_sec": 0, 00:18:10.985 "rw_mbytes_per_sec": 0, 00:18:10.985 "r_mbytes_per_sec": 0, 00:18:10.985 "w_mbytes_per_sec": 0 00:18:10.985 }, 00:18:10.985 "claimed": true, 00:18:10.985 "claim_type": "exclusive_write", 00:18:10.985 "zoned": false, 00:18:10.985 "supported_io_types": { 00:18:10.985 "read": true, 00:18:10.985 "write": true, 00:18:10.985 "unmap": true, 00:18:10.985 "flush": true, 00:18:10.985 "reset": true, 00:18:10.985 "nvme_admin": false, 00:18:10.985 "nvme_io": false, 00:18:10.985 "nvme_io_md": false, 00:18:10.985 "write_zeroes": true, 00:18:10.985 "zcopy": true, 00:18:10.985 "get_zone_info": false, 00:18:10.985 "zone_management": false, 00:18:10.985 "zone_append": false, 00:18:10.985 "compare": false, 00:18:10.985 "compare_and_write": false, 00:18:10.985 "abort": true, 00:18:10.985 "seek_hole": false, 00:18:10.985 "seek_data": false, 00:18:10.985 "copy": true, 00:18:10.985 "nvme_iov_md": false 00:18:10.985 }, 00:18:10.985 "memory_domains": [ 00:18:10.985 { 00:18:10.985 "dma_device_id": "system", 00:18:10.985 "dma_device_type": 1 00:18:10.985 }, 00:18:10.985 { 00:18:10.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.985 "dma_device_type": 2 00:18:10.985 } 00:18:10.985 ], 00:18:10.985 "driver_specific": {} 00:18:10.985 }' 00:18:10.985 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.985 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.985 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:10.985 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.985 07:23:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:11.243 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:11.502 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:11.502 "name": "BaseBdev3", 00:18:11.502 "aliases": [ 00:18:11.502 "682e5b77-e952-4b9f-90e6-3bac4ccef922" 00:18:11.502 ], 00:18:11.502 "product_name": "Malloc disk", 00:18:11.502 "block_size": 512, 00:18:11.502 "num_blocks": 65536, 00:18:11.502 "uuid": "682e5b77-e952-4b9f-90e6-3bac4ccef922", 00:18:11.502 "assigned_rate_limits": { 00:18:11.502 "rw_ios_per_sec": 0, 00:18:11.502 "rw_mbytes_per_sec": 0, 00:18:11.502 "r_mbytes_per_sec": 0, 00:18:11.502 "w_mbytes_per_sec": 0 00:18:11.502 }, 00:18:11.502 "claimed": true, 00:18:11.502 "claim_type": "exclusive_write", 00:18:11.502 "zoned": false, 00:18:11.502 "supported_io_types": { 00:18:11.502 "read": true, 00:18:11.502 "write": true, 00:18:11.502 "unmap": true, 00:18:11.502 "flush": true, 00:18:11.502 "reset": true, 00:18:11.502 "nvme_admin": false, 00:18:11.502 "nvme_io": false, 00:18:11.502 "nvme_io_md": false, 00:18:11.502 "write_zeroes": true, 00:18:11.502 "zcopy": true, 00:18:11.502 "get_zone_info": false, 00:18:11.502 "zone_management": false, 00:18:11.502 "zone_append": false, 00:18:11.502 "compare": false, 00:18:11.502 "compare_and_write": false, 00:18:11.502 "abort": true, 00:18:11.502 "seek_hole": false, 00:18:11.502 "seek_data": false, 00:18:11.502 "copy": true, 00:18:11.502 "nvme_iov_md": false 00:18:11.502 }, 00:18:11.502 "memory_domains": [ 00:18:11.502 { 00:18:11.502 "dma_device_id": "system", 00:18:11.502 "dma_device_type": 1 00:18:11.502 }, 00:18:11.502 { 00:18:11.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.502 "dma_device_type": 2 00:18:11.502 } 00:18:11.502 ], 00:18:11.502 "driver_specific": {} 00:18:11.502 }' 00:18:11.502 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:11.502 07:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:11.502 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:11.502 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.760 07:23:44 
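
The block_size/md_size checks running here (they finish for BaseBdev3 just below) are verify_raid_bdev_properties comparing the raid bdev's geometry and metadata fields against each configured base bdev. A condensed reading of those checks; the loop and variables are a simplification, while the RPCs and jq filters are the ones in the log:

    raid_json=$(spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid | jq '.[]')
    for name in NewBaseBdev BaseBdev2 BaseBdev3; do
        base_json=$(spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$name" | jq '.[]')
        # raid and base bdev must agree on block size and on the (absent -> null) md/dif fields
        for field in .block_size .md_size .md_interleave .dif_type; do
            [[ $(jq "$field" <<<"$raid_json") == $(jq "$field" <<<"$base_json") ]] \
                || echo "$name: $field differs from Existed_Raid"
        done
    done
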
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:11.760 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:12.019 [2024-07-25 07:23:44.496063] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:12.019 [2024-07-25 07:23:44.496090] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:12.019 [2024-07-25 07:23:44.496149] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:12.019 [2024-07-25 07:23:44.496387] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:12.019 [2024-07-25 07:23:44.496399] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1abf950 name Existed_Raid, state offline 00:18:12.019 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1648398 00:18:12.019 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1648398 ']' 00:18:12.019 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1648398 00:18:12.019 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:18:12.019 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:12.019 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1648398 00:18:12.277 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:12.277 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:12.277 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1648398' 00:18:12.277 killing process with pid 1648398 00:18:12.277 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1648398 00:18:12.277 [2024-07-25 07:23:44.576017] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:12.277 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1648398 00:18:12.277 [2024-07-25 07:23:44.599656] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:12.278 07:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:12.278 00:18:12.278 real 0m27.033s 00:18:12.278 user 0m49.378s 00:18:12.278 sys 0m5.024s 00:18:12.278 07:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:12.278 07:23:44 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:12.278 ************************************ 00:18:12.278 END TEST raid_state_function_test_sb 00:18:12.278 ************************************ 00:18:12.537 07:23:44 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:18:12.537 07:23:44 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:18:12.537 07:23:44 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:12.537 07:23:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:12.537 ************************************ 00:18:12.537 START TEST raid_superblock_test 00:18:12.537 ************************************ 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 3 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1653517 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1653517 /var/tmp/spdk-raid.sock 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1653517 ']' 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:12.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
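
run_test has just launched a dedicated bdev_svc app for raid_superblock_test and is waiting for its RPC socket; each test in this file talks to such a private app rather than a full SPDK target. Roughly the same launch pattern, with a plain polling loop standing in for the suite's waitforlisten helper:

    # minimal bdev-only SPDK app: RPC on the raid test socket, bdev_raid debug logging on
    spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
    svc_pid=$!

    # wait until the app answers RPCs on that socket, then drive it with rpc.py -s ...
    until spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.1
    done

    # ... test body: bdev_malloc_create / bdev_passthru_create / bdev_raid_create ...

    # teardown kills the app again, as the killprocess lines above did for the previous test
    kill "$svc_pid" && wait "$svc_pid"
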
00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:12.537 07:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.537 [2024-07-25 07:23:44.939678] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:18:12.537 [2024-07-25 07:23:44.939738] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653517 ] 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 
0000:3f:01.4 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:12.537 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:12.537 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:12.797 [2024-07-25 07:23:45.074011] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:12.797 [2024-07-25 07:23:45.155356] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:18:12.797 [2024-07-25 07:23:45.216135] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:12.797 [2024-07-25 07:23:45.216176] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:13.364 07:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:13.623 malloc1 00:18:13.623 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:13.881 [2024-07-25 07:23:46.226184] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:13.881 [2024-07-25 07:23:46.226233] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.881 [2024-07-25 07:23:46.226253] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd4d280 00:18:13.881 [2024-07-25 07:23:46.226264] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.881 [2024-07-25 07:23:46.227859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.881 [2024-07-25 07:23:46.227887] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:13.881 pt1 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:13.881 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:14.140 malloc2 00:18:14.140 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:14.399 [2024-07-25 07:23:46.683766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:14.399 [2024-07-25 07:23:46.683805] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.399 [2024-07-25 07:23:46.683821] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef88c0 00:18:14.399 [2024-07-25 07:23:46.683833] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.399 [2024-07-25 07:23:46.685096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:14.399 [2024-07-25 07:23:46.685123] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:14.399 pt2 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:14.399 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:14.399 malloc3 00:18:14.658 07:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:14.658 [2024-07-25 07:23:47.141018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:14.658 [2024-07-25 07:23:47.141056] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.658 [2024-07-25 07:23:47.141073] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef8ef0 00:18:14.658 [2024-07-25 07:23:47.141084] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.658 [2024-07-25 07:23:47.142350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:14.658 [2024-07-25 07:23:47.142376] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:14.658 pt3 00:18:14.658 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:14.658 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:14.658 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:18:14.916 [2024-07-25 07:23:47.373647] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:14.916 [2024-07-25 07:23:47.374730] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:14.916 [2024-07-25 07:23:47.374780] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:14.916 [2024-07-25 07:23:47.374923] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xefc330 00:18:14.916 [2024-07-25 07:23:47.374933] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:14.916 [2024-07-25 07:23:47.375105] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd64170 00:18:14.916 [2024-07-25 07:23:47.375248] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xefc330 00:18:14.916 [2024-07-25 07:23:47.375258] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xefc330 00:18:14.916 [2024-07-25 07:23:47.375344] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:14.916 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.175 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.175 "name": "raid_bdev1", 00:18:15.175 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:15.175 "strip_size_kb": 0, 00:18:15.175 "state": "online", 00:18:15.175 "raid_level": "raid1", 00:18:15.175 "superblock": true, 00:18:15.175 "num_base_bdevs": 3, 00:18:15.175 "num_base_bdevs_discovered": 3, 00:18:15.175 "num_base_bdevs_operational": 3, 00:18:15.175 "base_bdevs_list": [ 00:18:15.175 { 00:18:15.175 "name": "pt1", 00:18:15.175 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:15.175 "is_configured": true, 00:18:15.175 "data_offset": 2048, 00:18:15.175 "data_size": 63488 00:18:15.175 }, 00:18:15.175 { 00:18:15.175 "name": "pt2", 00:18:15.175 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:15.175 "is_configured": true, 00:18:15.175 "data_offset": 2048, 00:18:15.175 "data_size": 63488 00:18:15.175 }, 00:18:15.175 { 00:18:15.175 "name": "pt3", 00:18:15.175 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:15.175 "is_configured": true, 00:18:15.175 "data_offset": 2048, 00:18:15.175 "data_size": 63488 00:18:15.175 } 00:18:15.175 ] 00:18:15.175 }' 00:18:15.175 07:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.175 07:23:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:15.743 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:16.001 [2024-07-25 07:23:48.420634] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:16.001 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:16.001 "name": "raid_bdev1", 00:18:16.001 "aliases": [ 00:18:16.001 
"c899d34e-bfc2-489b-b477-6f65068bd2c0" 00:18:16.001 ], 00:18:16.001 "product_name": "Raid Volume", 00:18:16.001 "block_size": 512, 00:18:16.001 "num_blocks": 63488, 00:18:16.001 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:16.001 "assigned_rate_limits": { 00:18:16.001 "rw_ios_per_sec": 0, 00:18:16.001 "rw_mbytes_per_sec": 0, 00:18:16.001 "r_mbytes_per_sec": 0, 00:18:16.001 "w_mbytes_per_sec": 0 00:18:16.001 }, 00:18:16.002 "claimed": false, 00:18:16.002 "zoned": false, 00:18:16.002 "supported_io_types": { 00:18:16.002 "read": true, 00:18:16.002 "write": true, 00:18:16.002 "unmap": false, 00:18:16.002 "flush": false, 00:18:16.002 "reset": true, 00:18:16.002 "nvme_admin": false, 00:18:16.002 "nvme_io": false, 00:18:16.002 "nvme_io_md": false, 00:18:16.002 "write_zeroes": true, 00:18:16.002 "zcopy": false, 00:18:16.002 "get_zone_info": false, 00:18:16.002 "zone_management": false, 00:18:16.002 "zone_append": false, 00:18:16.002 "compare": false, 00:18:16.002 "compare_and_write": false, 00:18:16.002 "abort": false, 00:18:16.002 "seek_hole": false, 00:18:16.002 "seek_data": false, 00:18:16.002 "copy": false, 00:18:16.002 "nvme_iov_md": false 00:18:16.002 }, 00:18:16.002 "memory_domains": [ 00:18:16.002 { 00:18:16.002 "dma_device_id": "system", 00:18:16.002 "dma_device_type": 1 00:18:16.002 }, 00:18:16.002 { 00:18:16.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.002 "dma_device_type": 2 00:18:16.002 }, 00:18:16.002 { 00:18:16.002 "dma_device_id": "system", 00:18:16.002 "dma_device_type": 1 00:18:16.002 }, 00:18:16.002 { 00:18:16.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.002 "dma_device_type": 2 00:18:16.002 }, 00:18:16.002 { 00:18:16.002 "dma_device_id": "system", 00:18:16.002 "dma_device_type": 1 00:18:16.002 }, 00:18:16.002 { 00:18:16.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.002 "dma_device_type": 2 00:18:16.002 } 00:18:16.002 ], 00:18:16.002 "driver_specific": { 00:18:16.002 "raid": { 00:18:16.002 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:16.002 "strip_size_kb": 0, 00:18:16.002 "state": "online", 00:18:16.002 "raid_level": "raid1", 00:18:16.002 "superblock": true, 00:18:16.002 "num_base_bdevs": 3, 00:18:16.002 "num_base_bdevs_discovered": 3, 00:18:16.002 "num_base_bdevs_operational": 3, 00:18:16.002 "base_bdevs_list": [ 00:18:16.002 { 00:18:16.002 "name": "pt1", 00:18:16.002 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:16.002 "is_configured": true, 00:18:16.002 "data_offset": 2048, 00:18:16.002 "data_size": 63488 00:18:16.002 }, 00:18:16.002 { 00:18:16.002 "name": "pt2", 00:18:16.002 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:16.002 "is_configured": true, 00:18:16.002 "data_offset": 2048, 00:18:16.002 "data_size": 63488 00:18:16.002 }, 00:18:16.002 { 00:18:16.002 "name": "pt3", 00:18:16.002 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:16.002 "is_configured": true, 00:18:16.002 "data_offset": 2048, 00:18:16.002 "data_size": 63488 00:18:16.002 } 00:18:16.002 ] 00:18:16.002 } 00:18:16.002 } 00:18:16.002 }' 00:18:16.002 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:16.002 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:16.002 pt2 00:18:16.002 pt3' 00:18:16.002 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.002 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:16.002 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.260 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.260 "name": "pt1", 00:18:16.260 "aliases": [ 00:18:16.260 "00000000-0000-0000-0000-000000000001" 00:18:16.260 ], 00:18:16.260 "product_name": "passthru", 00:18:16.260 "block_size": 512, 00:18:16.260 "num_blocks": 65536, 00:18:16.260 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:16.260 "assigned_rate_limits": { 00:18:16.260 "rw_ios_per_sec": 0, 00:18:16.260 "rw_mbytes_per_sec": 0, 00:18:16.260 "r_mbytes_per_sec": 0, 00:18:16.260 "w_mbytes_per_sec": 0 00:18:16.260 }, 00:18:16.260 "claimed": true, 00:18:16.260 "claim_type": "exclusive_write", 00:18:16.260 "zoned": false, 00:18:16.260 "supported_io_types": { 00:18:16.260 "read": true, 00:18:16.260 "write": true, 00:18:16.260 "unmap": true, 00:18:16.260 "flush": true, 00:18:16.260 "reset": true, 00:18:16.260 "nvme_admin": false, 00:18:16.260 "nvme_io": false, 00:18:16.260 "nvme_io_md": false, 00:18:16.260 "write_zeroes": true, 00:18:16.260 "zcopy": true, 00:18:16.260 "get_zone_info": false, 00:18:16.260 "zone_management": false, 00:18:16.260 "zone_append": false, 00:18:16.260 "compare": false, 00:18:16.260 "compare_and_write": false, 00:18:16.260 "abort": true, 00:18:16.260 "seek_hole": false, 00:18:16.260 "seek_data": false, 00:18:16.260 "copy": true, 00:18:16.260 "nvme_iov_md": false 00:18:16.260 }, 00:18:16.260 "memory_domains": [ 00:18:16.260 { 00:18:16.260 "dma_device_id": "system", 00:18:16.260 "dma_device_type": 1 00:18:16.260 }, 00:18:16.260 { 00:18:16.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.260 "dma_device_type": 2 00:18:16.260 } 00:18:16.260 ], 00:18:16.260 "driver_specific": { 00:18:16.260 "passthru": { 00:18:16.260 "name": "pt1", 00:18:16.260 "base_bdev_name": "malloc1" 00:18:16.260 } 00:18:16.260 } 00:18:16.260 }' 00:18:16.260 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.260 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.519 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:16.519 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.519 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.519 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:16.519 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.519 07:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.519 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:16.519 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.778 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.778 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:16.778 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.778 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:16.778 07:23:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.346 "name": "pt2", 00:18:17.346 "aliases": [ 00:18:17.346 "00000000-0000-0000-0000-000000000002" 00:18:17.346 ], 00:18:17.346 "product_name": "passthru", 00:18:17.346 "block_size": 512, 00:18:17.346 "num_blocks": 65536, 00:18:17.346 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:17.346 "assigned_rate_limits": { 00:18:17.346 "rw_ios_per_sec": 0, 00:18:17.346 "rw_mbytes_per_sec": 0, 00:18:17.346 "r_mbytes_per_sec": 0, 00:18:17.346 "w_mbytes_per_sec": 0 00:18:17.346 }, 00:18:17.346 "claimed": true, 00:18:17.346 "claim_type": "exclusive_write", 00:18:17.346 "zoned": false, 00:18:17.346 "supported_io_types": { 00:18:17.346 "read": true, 00:18:17.346 "write": true, 00:18:17.346 "unmap": true, 00:18:17.346 "flush": true, 00:18:17.346 "reset": true, 00:18:17.346 "nvme_admin": false, 00:18:17.346 "nvme_io": false, 00:18:17.346 "nvme_io_md": false, 00:18:17.346 "write_zeroes": true, 00:18:17.346 "zcopy": true, 00:18:17.346 "get_zone_info": false, 00:18:17.346 "zone_management": false, 00:18:17.346 "zone_append": false, 00:18:17.346 "compare": false, 00:18:17.346 "compare_and_write": false, 00:18:17.346 "abort": true, 00:18:17.346 "seek_hole": false, 00:18:17.346 "seek_data": false, 00:18:17.346 "copy": true, 00:18:17.346 "nvme_iov_md": false 00:18:17.346 }, 00:18:17.346 "memory_domains": [ 00:18:17.346 { 00:18:17.346 "dma_device_id": "system", 00:18:17.346 "dma_device_type": 1 00:18:17.346 }, 00:18:17.346 { 00:18:17.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.346 "dma_device_type": 2 00:18:17.346 } 00:18:17.346 ], 00:18:17.346 "driver_specific": { 00:18:17.346 "passthru": { 00:18:17.346 "name": "pt2", 00:18:17.346 "base_bdev_name": "malloc2" 00:18:17.346 } 00:18:17.346 } 00:18:17.346 }' 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.346 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.605 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.605 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.605 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.605 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.605 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.605 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:17.605 07:23:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.862 07:23:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.862 "name": "pt3", 00:18:17.862 "aliases": [ 00:18:17.862 "00000000-0000-0000-0000-000000000003" 00:18:17.862 ], 00:18:17.862 "product_name": "passthru", 00:18:17.862 "block_size": 512, 00:18:17.862 "num_blocks": 65536, 00:18:17.862 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:17.862 "assigned_rate_limits": { 00:18:17.862 "rw_ios_per_sec": 0, 00:18:17.862 "rw_mbytes_per_sec": 0, 00:18:17.862 "r_mbytes_per_sec": 0, 00:18:17.862 "w_mbytes_per_sec": 0 00:18:17.862 }, 00:18:17.862 "claimed": true, 00:18:17.862 "claim_type": "exclusive_write", 00:18:17.862 "zoned": false, 00:18:17.862 "supported_io_types": { 00:18:17.862 "read": true, 00:18:17.862 "write": true, 00:18:17.862 "unmap": true, 00:18:17.862 "flush": true, 00:18:17.862 "reset": true, 00:18:17.862 "nvme_admin": false, 00:18:17.862 "nvme_io": false, 00:18:17.862 "nvme_io_md": false, 00:18:17.862 "write_zeroes": true, 00:18:17.862 "zcopy": true, 00:18:17.862 "get_zone_info": false, 00:18:17.862 "zone_management": false, 00:18:17.862 "zone_append": false, 00:18:17.862 "compare": false, 00:18:17.862 "compare_and_write": false, 00:18:17.862 "abort": true, 00:18:17.862 "seek_hole": false, 00:18:17.862 "seek_data": false, 00:18:17.862 "copy": true, 00:18:17.862 "nvme_iov_md": false 00:18:17.862 }, 00:18:17.862 "memory_domains": [ 00:18:17.862 { 00:18:17.862 "dma_device_id": "system", 00:18:17.862 "dma_device_type": 1 00:18:17.862 }, 00:18:17.862 { 00:18:17.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.862 "dma_device_type": 2 00:18:17.862 } 00:18:17.862 ], 00:18:17.862 "driver_specific": { 00:18:17.862 "passthru": { 00:18:17.862 "name": "pt3", 00:18:17.862 "base_bdev_name": "malloc3" 00:18:17.862 } 00:18:17.862 } 00:18:17.862 }' 00:18:17.862 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.862 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.862 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.863 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.863 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.863 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.863 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.121 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.121 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.121 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.121 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.121 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.121 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:18.121 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:18:18.380 [2024-07-25 07:23:50.750752] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:18.380 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=c899d34e-bfc2-489b-b477-6f65068bd2c0 00:18:18.380 07:23:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z c899d34e-bfc2-489b-b477-6f65068bd2c0 ']' 00:18:18.380 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:18.639 [2024-07-25 07:23:50.914928] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:18.639 [2024-07-25 07:23:50.914947] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:18.639 [2024-07-25 07:23:50.914999] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:18.639 [2024-07-25 07:23:50.915061] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:18.639 [2024-07-25 07:23:50.915072] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefc330 name raid_bdev1, state offline 00:18:18.639 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.639 07:23:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:18:18.639 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:18:18.639 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:18:18.639 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:18.639 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:18.898 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:18.898 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:19.157 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:19.157 07:23:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:19.725 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:19.725 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:19.985 [2024-07-25 07:23:52.486999] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:19.985 [2024-07-25 07:23:52.488277] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:19.985 [2024-07-25 07:23:52.488320] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:19.985 [2024-07-25 07:23:52.488363] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:19.985 [2024-07-25 07:23:52.488402] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:19.985 [2024-07-25 07:23:52.488424] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:19.985 [2024-07-25 07:23:52.488440] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:19.985 [2024-07-25 07:23:52.488450] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xef7e80 name raid_bdev1, state configuring 00:18:19.985 request: 00:18:19.985 { 00:18:19.985 "name": "raid_bdev1", 00:18:19.985 "raid_level": "raid1", 00:18:19.985 "base_bdevs": [ 00:18:19.985 "malloc1", 00:18:19.985 "malloc2", 00:18:19.985 "malloc3" 00:18:19.985 ], 00:18:19.985 "superblock": false, 00:18:19.985 "method": "bdev_raid_create", 00:18:19.985 "req_id": 1 00:18:19.985 } 00:18:19.985 Got JSON-RPC error response 00:18:19.985 response: 00:18:19.985 { 00:18:19.985 "code": -17, 00:18:19.985 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:19.985 } 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.985 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:18:20.244 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:18:20.244 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:18:20.244 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:20.504 [2024-07-25 07:23:52.932111] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:20.504 [2024-07-25 07:23:52.932161] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:20.504 [2024-07-25 07:23:52.932179] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef6490 00:18:20.504 [2024-07-25 07:23:52.932191] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:20.504 [2024-07-25 07:23:52.933683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:20.504 [2024-07-25 07:23:52.933712] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:20.504 [2024-07-25 07:23:52.933775] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:20.504 [2024-07-25 07:23:52.933801] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:20.504 pt1 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.504 07:23:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:20.763 07:23:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.763 "name": "raid_bdev1", 00:18:20.763 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:20.763 "strip_size_kb": 0, 00:18:20.763 "state": "configuring", 00:18:20.763 "raid_level": "raid1", 00:18:20.763 "superblock": true, 00:18:20.763 "num_base_bdevs": 3, 00:18:20.763 "num_base_bdevs_discovered": 1, 00:18:20.763 "num_base_bdevs_operational": 3, 00:18:20.763 "base_bdevs_list": [ 00:18:20.763 { 00:18:20.763 "name": "pt1", 
00:18:20.763 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:20.763 "is_configured": true, 00:18:20.763 "data_offset": 2048, 00:18:20.763 "data_size": 63488 00:18:20.763 }, 00:18:20.763 { 00:18:20.763 "name": null, 00:18:20.763 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:20.763 "is_configured": false, 00:18:20.763 "data_offset": 2048, 00:18:20.763 "data_size": 63488 00:18:20.763 }, 00:18:20.763 { 00:18:20.763 "name": null, 00:18:20.763 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:20.763 "is_configured": false, 00:18:20.763 "data_offset": 2048, 00:18:20.763 "data_size": 63488 00:18:20.763 } 00:18:20.763 ] 00:18:20.763 }' 00:18:20.763 07:23:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.763 07:23:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:21.332 07:23:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:18:21.332 07:23:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:21.591 [2024-07-25 07:23:53.990920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:21.591 [2024-07-25 07:23:53.990966] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.591 [2024-07-25 07:23:53.990988] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd4de70 00:18:21.591 [2024-07-25 07:23:53.991000] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.591 [2024-07-25 07:23:53.991328] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.591 [2024-07-25 07:23:53.991344] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:21.591 [2024-07-25 07:23:53.991402] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:21.591 [2024-07-25 07:23:53.991420] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:21.591 pt2 00:18:21.591 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:21.850 [2024-07-25 07:23:54.215534] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:21.850 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.109 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.109 "name": "raid_bdev1", 00:18:22.109 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:22.109 "strip_size_kb": 0, 00:18:22.109 "state": "configuring", 00:18:22.109 "raid_level": "raid1", 00:18:22.109 "superblock": true, 00:18:22.109 "num_base_bdevs": 3, 00:18:22.109 "num_base_bdevs_discovered": 1, 00:18:22.109 "num_base_bdevs_operational": 3, 00:18:22.109 "base_bdevs_list": [ 00:18:22.109 { 00:18:22.109 "name": "pt1", 00:18:22.109 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:22.109 "is_configured": true, 00:18:22.109 "data_offset": 2048, 00:18:22.109 "data_size": 63488 00:18:22.109 }, 00:18:22.109 { 00:18:22.109 "name": null, 00:18:22.109 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:22.109 "is_configured": false, 00:18:22.109 "data_offset": 2048, 00:18:22.109 "data_size": 63488 00:18:22.109 }, 00:18:22.109 { 00:18:22.109 "name": null, 00:18:22.110 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:22.110 "is_configured": false, 00:18:22.110 "data_offset": 2048, 00:18:22.110 "data_size": 63488 00:18:22.110 } 00:18:22.110 ] 00:18:22.110 }' 00:18:22.110 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.110 07:23:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.677 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:18:22.677 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:22.677 07:23:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:22.677 [2024-07-25 07:23:55.166024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:22.677 [2024-07-25 07:23:55.166074] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.677 [2024-07-25 07:23:55.166092] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef8af0 00:18:22.677 [2024-07-25 07:23:55.166103] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.677 [2024-07-25 07:23:55.166439] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.677 [2024-07-25 07:23:55.166456] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:22.677 [2024-07-25 07:23:55.166515] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:22.677 [2024-07-25 07:23:55.166534] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:22.677 pt2 00:18:22.677 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:18:22.677 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:22.677 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:22.937 [2024-07-25 07:23:55.398632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:22.937 [2024-07-25 07:23:55.398671] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.937 [2024-07-25 07:23:55.398687] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xefa550 00:18:22.937 [2024-07-25 07:23:55.398698] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.937 [2024-07-25 07:23:55.398987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.937 [2024-07-25 07:23:55.399003] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:22.937 [2024-07-25 07:23:55.399056] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:22.937 [2024-07-25 07:23:55.399073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:22.937 [2024-07-25 07:23:55.399195] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xefe6c0 00:18:22.937 [2024-07-25 07:23:55.399206] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:22.937 [2024-07-25 07:23:55.399362] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xef9800 00:18:22.937 [2024-07-25 07:23:55.399486] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xefe6c0 00:18:22.937 [2024-07-25 07:23:55.399500] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xefe6c0 00:18:22.937 [2024-07-25 07:23:55.399591] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:22.937 pt3 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.937 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:23.228 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:18:23.228 "name": "raid_bdev1", 00:18:23.228 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:23.228 "strip_size_kb": 0, 00:18:23.228 "state": "online", 00:18:23.228 "raid_level": "raid1", 00:18:23.228 "superblock": true, 00:18:23.228 "num_base_bdevs": 3, 00:18:23.228 "num_base_bdevs_discovered": 3, 00:18:23.228 "num_base_bdevs_operational": 3, 00:18:23.228 "base_bdevs_list": [ 00:18:23.228 { 00:18:23.228 "name": "pt1", 00:18:23.228 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:23.228 "is_configured": true, 00:18:23.228 "data_offset": 2048, 00:18:23.228 "data_size": 63488 00:18:23.228 }, 00:18:23.228 { 00:18:23.228 "name": "pt2", 00:18:23.228 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:23.228 "is_configured": true, 00:18:23.228 "data_offset": 2048, 00:18:23.228 "data_size": 63488 00:18:23.228 }, 00:18:23.228 { 00:18:23.228 "name": "pt3", 00:18:23.228 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:23.228 "is_configured": true, 00:18:23.228 "data_offset": 2048, 00:18:23.228 "data_size": 63488 00:18:23.228 } 00:18:23.228 ] 00:18:23.228 }' 00:18:23.228 07:23:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:23.228 07:23:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:23.796 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:24.056 [2024-07-25 07:23:56.441630] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:24.056 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:24.056 "name": "raid_bdev1", 00:18:24.056 "aliases": [ 00:18:24.056 "c899d34e-bfc2-489b-b477-6f65068bd2c0" 00:18:24.056 ], 00:18:24.056 "product_name": "Raid Volume", 00:18:24.056 "block_size": 512, 00:18:24.056 "num_blocks": 63488, 00:18:24.056 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:24.056 "assigned_rate_limits": { 00:18:24.056 "rw_ios_per_sec": 0, 00:18:24.056 "rw_mbytes_per_sec": 0, 00:18:24.056 "r_mbytes_per_sec": 0, 00:18:24.056 "w_mbytes_per_sec": 0 00:18:24.056 }, 00:18:24.056 "claimed": false, 00:18:24.056 "zoned": false, 00:18:24.056 "supported_io_types": { 00:18:24.056 "read": true, 00:18:24.056 "write": true, 00:18:24.056 "unmap": false, 00:18:24.056 "flush": false, 00:18:24.056 "reset": true, 00:18:24.056 "nvme_admin": false, 00:18:24.056 "nvme_io": false, 00:18:24.056 "nvme_io_md": false, 00:18:24.056 "write_zeroes": true, 00:18:24.056 "zcopy": false, 00:18:24.056 "get_zone_info": false, 00:18:24.056 "zone_management": false, 00:18:24.056 "zone_append": false, 00:18:24.056 "compare": false, 00:18:24.056 "compare_and_write": false, 00:18:24.056 
"abort": false, 00:18:24.056 "seek_hole": false, 00:18:24.056 "seek_data": false, 00:18:24.056 "copy": false, 00:18:24.056 "nvme_iov_md": false 00:18:24.056 }, 00:18:24.056 "memory_domains": [ 00:18:24.056 { 00:18:24.056 "dma_device_id": "system", 00:18:24.056 "dma_device_type": 1 00:18:24.056 }, 00:18:24.056 { 00:18:24.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.056 "dma_device_type": 2 00:18:24.056 }, 00:18:24.056 { 00:18:24.056 "dma_device_id": "system", 00:18:24.056 "dma_device_type": 1 00:18:24.056 }, 00:18:24.056 { 00:18:24.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.056 "dma_device_type": 2 00:18:24.056 }, 00:18:24.056 { 00:18:24.056 "dma_device_id": "system", 00:18:24.056 "dma_device_type": 1 00:18:24.056 }, 00:18:24.056 { 00:18:24.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.056 "dma_device_type": 2 00:18:24.056 } 00:18:24.056 ], 00:18:24.056 "driver_specific": { 00:18:24.056 "raid": { 00:18:24.056 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:24.056 "strip_size_kb": 0, 00:18:24.056 "state": "online", 00:18:24.056 "raid_level": "raid1", 00:18:24.056 "superblock": true, 00:18:24.056 "num_base_bdevs": 3, 00:18:24.056 "num_base_bdevs_discovered": 3, 00:18:24.056 "num_base_bdevs_operational": 3, 00:18:24.056 "base_bdevs_list": [ 00:18:24.056 { 00:18:24.056 "name": "pt1", 00:18:24.056 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:24.056 "is_configured": true, 00:18:24.056 "data_offset": 2048, 00:18:24.056 "data_size": 63488 00:18:24.056 }, 00:18:24.056 { 00:18:24.056 "name": "pt2", 00:18:24.056 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:24.056 "is_configured": true, 00:18:24.056 "data_offset": 2048, 00:18:24.056 "data_size": 63488 00:18:24.056 }, 00:18:24.056 { 00:18:24.056 "name": "pt3", 00:18:24.056 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:24.056 "is_configured": true, 00:18:24.056 "data_offset": 2048, 00:18:24.056 "data_size": 63488 00:18:24.056 } 00:18:24.056 ] 00:18:24.056 } 00:18:24.056 } 00:18:24.056 }' 00:18:24.056 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:24.056 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:24.056 pt2 00:18:24.056 pt3' 00:18:24.056 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:24.056 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:24.056 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:24.316 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:24.316 "name": "pt1", 00:18:24.316 "aliases": [ 00:18:24.316 "00000000-0000-0000-0000-000000000001" 00:18:24.316 ], 00:18:24.316 "product_name": "passthru", 00:18:24.316 "block_size": 512, 00:18:24.316 "num_blocks": 65536, 00:18:24.316 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:24.316 "assigned_rate_limits": { 00:18:24.316 "rw_ios_per_sec": 0, 00:18:24.316 "rw_mbytes_per_sec": 0, 00:18:24.316 "r_mbytes_per_sec": 0, 00:18:24.316 "w_mbytes_per_sec": 0 00:18:24.316 }, 00:18:24.316 "claimed": true, 00:18:24.316 "claim_type": "exclusive_write", 00:18:24.316 "zoned": false, 00:18:24.316 "supported_io_types": { 00:18:24.316 "read": true, 00:18:24.316 "write": true, 00:18:24.316 "unmap": true, 00:18:24.316 
"flush": true, 00:18:24.316 "reset": true, 00:18:24.316 "nvme_admin": false, 00:18:24.316 "nvme_io": false, 00:18:24.316 "nvme_io_md": false, 00:18:24.316 "write_zeroes": true, 00:18:24.316 "zcopy": true, 00:18:24.316 "get_zone_info": false, 00:18:24.316 "zone_management": false, 00:18:24.316 "zone_append": false, 00:18:24.316 "compare": false, 00:18:24.316 "compare_and_write": false, 00:18:24.316 "abort": true, 00:18:24.316 "seek_hole": false, 00:18:24.316 "seek_data": false, 00:18:24.316 "copy": true, 00:18:24.316 "nvme_iov_md": false 00:18:24.316 }, 00:18:24.316 "memory_domains": [ 00:18:24.316 { 00:18:24.316 "dma_device_id": "system", 00:18:24.316 "dma_device_type": 1 00:18:24.316 }, 00:18:24.316 { 00:18:24.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.316 "dma_device_type": 2 00:18:24.316 } 00:18:24.316 ], 00:18:24.316 "driver_specific": { 00:18:24.316 "passthru": { 00:18:24.316 "name": "pt1", 00:18:24.316 "base_bdev_name": "malloc1" 00:18:24.316 } 00:18:24.316 } 00:18:24.316 }' 00:18:24.316 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.316 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.316 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:24.316 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.316 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.575 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:24.575 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.575 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.575 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:24.575 07:23:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.575 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.575 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:24.575 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:24.575 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:24.575 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:24.834 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:24.834 "name": "pt2", 00:18:24.834 "aliases": [ 00:18:24.834 "00000000-0000-0000-0000-000000000002" 00:18:24.834 ], 00:18:24.834 "product_name": "passthru", 00:18:24.834 "block_size": 512, 00:18:24.834 "num_blocks": 65536, 00:18:24.834 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:24.834 "assigned_rate_limits": { 00:18:24.834 "rw_ios_per_sec": 0, 00:18:24.834 "rw_mbytes_per_sec": 0, 00:18:24.834 "r_mbytes_per_sec": 0, 00:18:24.834 "w_mbytes_per_sec": 0 00:18:24.834 }, 00:18:24.834 "claimed": true, 00:18:24.834 "claim_type": "exclusive_write", 00:18:24.834 "zoned": false, 00:18:24.834 "supported_io_types": { 00:18:24.834 "read": true, 00:18:24.834 "write": true, 00:18:24.834 "unmap": true, 00:18:24.834 "flush": true, 00:18:24.834 "reset": true, 00:18:24.834 "nvme_admin": false, 00:18:24.834 "nvme_io": false, 00:18:24.834 "nvme_io_md": 
false, 00:18:24.834 "write_zeroes": true, 00:18:24.834 "zcopy": true, 00:18:24.834 "get_zone_info": false, 00:18:24.834 "zone_management": false, 00:18:24.834 "zone_append": false, 00:18:24.834 "compare": false, 00:18:24.834 "compare_and_write": false, 00:18:24.834 "abort": true, 00:18:24.834 "seek_hole": false, 00:18:24.835 "seek_data": false, 00:18:24.835 "copy": true, 00:18:24.835 "nvme_iov_md": false 00:18:24.835 }, 00:18:24.835 "memory_domains": [ 00:18:24.835 { 00:18:24.835 "dma_device_id": "system", 00:18:24.835 "dma_device_type": 1 00:18:24.835 }, 00:18:24.835 { 00:18:24.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.835 "dma_device_type": 2 00:18:24.835 } 00:18:24.835 ], 00:18:24.835 "driver_specific": { 00:18:24.835 "passthru": { 00:18:24.835 "name": "pt2", 00:18:24.835 "base_bdev_name": "malloc2" 00:18:24.835 } 00:18:24.835 } 00:18:24.835 }' 00:18:24.835 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.835 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.835 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:24.835 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:25.094 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:25.354 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:25.354 "name": "pt3", 00:18:25.354 "aliases": [ 00:18:25.354 "00000000-0000-0000-0000-000000000003" 00:18:25.354 ], 00:18:25.354 "product_name": "passthru", 00:18:25.354 "block_size": 512, 00:18:25.354 "num_blocks": 65536, 00:18:25.354 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:25.354 "assigned_rate_limits": { 00:18:25.354 "rw_ios_per_sec": 0, 00:18:25.354 "rw_mbytes_per_sec": 0, 00:18:25.354 "r_mbytes_per_sec": 0, 00:18:25.354 "w_mbytes_per_sec": 0 00:18:25.354 }, 00:18:25.354 "claimed": true, 00:18:25.354 "claim_type": "exclusive_write", 00:18:25.354 "zoned": false, 00:18:25.354 "supported_io_types": { 00:18:25.354 "read": true, 00:18:25.354 "write": true, 00:18:25.354 "unmap": true, 00:18:25.354 "flush": true, 00:18:25.354 "reset": true, 00:18:25.354 "nvme_admin": false, 00:18:25.354 "nvme_io": false, 00:18:25.354 "nvme_io_md": false, 00:18:25.354 "write_zeroes": true, 00:18:25.354 "zcopy": true, 00:18:25.354 "get_zone_info": false, 00:18:25.354 
"zone_management": false, 00:18:25.354 "zone_append": false, 00:18:25.354 "compare": false, 00:18:25.354 "compare_and_write": false, 00:18:25.354 "abort": true, 00:18:25.354 "seek_hole": false, 00:18:25.354 "seek_data": false, 00:18:25.354 "copy": true, 00:18:25.354 "nvme_iov_md": false 00:18:25.354 }, 00:18:25.354 "memory_domains": [ 00:18:25.354 { 00:18:25.354 "dma_device_id": "system", 00:18:25.354 "dma_device_type": 1 00:18:25.354 }, 00:18:25.354 { 00:18:25.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.354 "dma_device_type": 2 00:18:25.354 } 00:18:25.354 ], 00:18:25.354 "driver_specific": { 00:18:25.354 "passthru": { 00:18:25.354 "name": "pt3", 00:18:25.354 "base_bdev_name": "malloc3" 00:18:25.354 } 00:18:25.354 } 00:18:25.354 }' 00:18:25.354 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.354 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.354 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:25.354 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.354 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.354 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:25.613 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.613 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.613 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:25.613 07:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.613 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.613 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:25.613 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:18:25.613 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:25.872 [2024-07-25 07:23:58.266433] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:25.872 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' c899d34e-bfc2-489b-b477-6f65068bd2c0 '!=' c899d34e-bfc2-489b-b477-6f65068bd2c0 ']' 00:18:25.872 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:18:25.872 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:25.872 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:25.872 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:26.131 [2024-07-25 07:23:58.498806] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.131 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:26.390 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.390 "name": "raid_bdev1", 00:18:26.390 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:26.390 "strip_size_kb": 0, 00:18:26.390 "state": "online", 00:18:26.390 "raid_level": "raid1", 00:18:26.390 "superblock": true, 00:18:26.390 "num_base_bdevs": 3, 00:18:26.390 "num_base_bdevs_discovered": 2, 00:18:26.390 "num_base_bdevs_operational": 2, 00:18:26.390 "base_bdevs_list": [ 00:18:26.390 { 00:18:26.390 "name": null, 00:18:26.390 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.390 "is_configured": false, 00:18:26.390 "data_offset": 2048, 00:18:26.390 "data_size": 63488 00:18:26.390 }, 00:18:26.390 { 00:18:26.390 "name": "pt2", 00:18:26.390 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:26.390 "is_configured": true, 00:18:26.390 "data_offset": 2048, 00:18:26.390 "data_size": 63488 00:18:26.390 }, 00:18:26.390 { 00:18:26.390 "name": "pt3", 00:18:26.390 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:26.390 "is_configured": true, 00:18:26.390 "data_offset": 2048, 00:18:26.390 "data_size": 63488 00:18:26.390 } 00:18:26.390 ] 00:18:26.390 }' 00:18:26.390 07:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.390 07:23:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.958 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:26.958 [2024-07-25 07:23:59.409182] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:26.958 [2024-07-25 07:23:59.409210] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:26.958 [2024-07-25 07:23:59.409263] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:26.958 [2024-07-25 07:23:59.409315] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:26.958 [2024-07-25 07:23:59.409326] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefe6c0 name raid_bdev1, state offline 00:18:26.958 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:18:26.958 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
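The trace above reduces to a short RPC sequence: the test deletes the pt1 passthru bdev out from under the array, then asserts through bdev_raid_get_bdevs that raid_bdev1 stays online with two of its three base bdevs still discovered. A minimal sketch of that check, assuming the same rpc.py socket and jq filters seen in the trace (the rpc shell variable is shorthand introduced here, not part of the script):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Drop one base bdev; raid1 has redundancy, so the array should survive.
    $rpc bdev_passthru_delete pt1
    # Read back the raid bdev and confirm it is degraded but still online.
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [ "$(echo "$info" | jq -r '.state')" = "online" ]
    [ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" = "2" ]
    [ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" = "2" ]
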
00:18:27.218 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:18:27.218 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:18:27.218 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:18:27.218 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:18:27.218 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:27.478 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:18:27.478 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:18:27.478 07:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:27.737 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:18:27.737 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:18:27.737 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:18:27.737 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:18:27.737 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:27.996 [2024-07-25 07:24:00.319532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:27.996 [2024-07-25 07:24:00.319581] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.996 [2024-07-25 07:24:00.319598] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xefa780 00:18:27.996 [2024-07-25 07:24:00.319610] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.996 [2024-07-25 07:24:00.321117] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.996 [2024-07-25 07:24:00.321156] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:27.996 [2024-07-25 07:24:00.321220] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:27.996 [2024-07-25 07:24:00.321245] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:27.996 pt2 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.996 
07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.996 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.997 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.997 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:28.256 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.256 "name": "raid_bdev1", 00:18:28.256 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:28.256 "strip_size_kb": 0, 00:18:28.256 "state": "configuring", 00:18:28.256 "raid_level": "raid1", 00:18:28.256 "superblock": true, 00:18:28.256 "num_base_bdevs": 3, 00:18:28.256 "num_base_bdevs_discovered": 1, 00:18:28.256 "num_base_bdevs_operational": 2, 00:18:28.256 "base_bdevs_list": [ 00:18:28.256 { 00:18:28.256 "name": null, 00:18:28.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:28.256 "is_configured": false, 00:18:28.256 "data_offset": 2048, 00:18:28.256 "data_size": 63488 00:18:28.256 }, 00:18:28.256 { 00:18:28.256 "name": "pt2", 00:18:28.256 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:28.256 "is_configured": true, 00:18:28.256 "data_offset": 2048, 00:18:28.256 "data_size": 63488 00:18:28.256 }, 00:18:28.256 { 00:18:28.256 "name": null, 00:18:28.256 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:28.256 "is_configured": false, 00:18:28.256 "data_offset": 2048, 00:18:28.256 "data_size": 63488 00:18:28.256 } 00:18:28.256 ] 00:18:28.256 }' 00:18:28.256 07:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.256 07:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=2 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:28.831 [2024-07-25 07:24:01.298129] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:28.831 [2024-07-25 07:24:01.298202] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.831 [2024-07-25 07:24:01.298231] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xefb040 00:18:28.831 [2024-07-25 07:24:01.298251] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.831 [2024-07-25 07:24:01.298576] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.831 [2024-07-25 07:24:01.298593] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:28.831 [2024-07-25 07:24:01.298655] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:28.831 [2024-07-25 07:24:01.298674] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:28.831 [2024-07-25 07:24:01.298766] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xefb2b0 00:18:28.831 
[2024-07-25 07:24:01.298776] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:28.831 [2024-07-25 07:24:01.298929] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xef9800 00:18:28.831 [2024-07-25 07:24:01.299045] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xefb2b0 00:18:28.831 [2024-07-25 07:24:01.299054] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xefb2b0 00:18:28.831 [2024-07-25 07:24:01.299155] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:28.831 pt3 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.831 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.832 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.832 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.832 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.832 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.090 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.090 "name": "raid_bdev1", 00:18:29.090 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:29.090 "strip_size_kb": 0, 00:18:29.090 "state": "online", 00:18:29.090 "raid_level": "raid1", 00:18:29.090 "superblock": true, 00:18:29.090 "num_base_bdevs": 3, 00:18:29.090 "num_base_bdevs_discovered": 2, 00:18:29.090 "num_base_bdevs_operational": 2, 00:18:29.090 "base_bdevs_list": [ 00:18:29.090 { 00:18:29.090 "name": null, 00:18:29.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.090 "is_configured": false, 00:18:29.090 "data_offset": 2048, 00:18:29.090 "data_size": 63488 00:18:29.090 }, 00:18:29.090 { 00:18:29.090 "name": "pt2", 00:18:29.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:29.090 "is_configured": true, 00:18:29.090 "data_offset": 2048, 00:18:29.090 "data_size": 63488 00:18:29.090 }, 00:18:29.090 { 00:18:29.090 "name": "pt3", 00:18:29.090 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:29.090 "is_configured": true, 00:18:29.090 "data_offset": 2048, 00:18:29.090 "data_size": 63488 00:18:29.090 } 00:18:29.090 ] 00:18:29.090 }' 00:18:29.090 07:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.090 07:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.659 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:29.919 [2024-07-25 07:24:02.340866] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:29.919 [2024-07-25 07:24:02.340894] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:29.919 [2024-07-25 07:24:02.340948] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:29.919 [2024-07-25 07:24:02.340997] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:29.919 [2024-07-25 07:24:02.341012] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefb2b0 name raid_bdev1, state offline 00:18:29.919 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.919 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:18:30.177 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:18:30.177 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:18:30.177 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 3 -gt 2 ']' 00:18:30.178 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=2 00:18:30.178 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:30.437 07:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:30.696 [2024-07-25 07:24:03.038672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:30.696 [2024-07-25 07:24:03.038717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:30.696 [2024-07-25 07:24:03.038733] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xefb040 00:18:30.696 [2024-07-25 07:24:03.038744] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:30.696 [2024-07-25 07:24:03.040245] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:30.696 [2024-07-25 07:24:03.040289] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:30.696 [2024-07-25 07:24:03.040352] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:30.696 [2024-07-25 07:24:03.040378] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:30.696 [2024-07-25 07:24:03.040469] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:30.696 [2024-07-25 07:24:03.040481] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:30.696 [2024-07-25 07:24:03.040495] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefdaa0 name raid_bdev1, state configuring 00:18:30.696 [2024-07-25 07:24:03.040517] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:30.696 pt1 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 3 -gt 2 ']' 
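At this point the script is rebuilding the array from on-disk superblocks: raid_bdev1 is deleted, pt3 is removed, and pt1 is re-registered so its superblock gets examined. Because pt2 carries a newer superblock (sequence number 4 versus pt1's 2), the array built around pt1 is dropped and re-created from pt2, leaving raid_bdev1 in the configuring state with only pt2 configured, which is what the JSON below reports. Condensed to the bare RPC calls from the trace (reusing the rpc shorthand from the earlier sketch):

    # Tear down the assembled array and the remaining pt3 passthru bdev.
    $rpc bdev_raid_delete raid_bdev1
    $rpc bdev_passthru_delete pt3
    # Re-register pt1; examine() finds its (stale) raid superblock.
    $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # The newer superblock on pt2 wins, so the re-created raid bdev sits in "configuring".
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
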
00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.696 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:30.956 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.956 "name": "raid_bdev1", 00:18:30.956 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:30.956 "strip_size_kb": 0, 00:18:30.956 "state": "configuring", 00:18:30.956 "raid_level": "raid1", 00:18:30.956 "superblock": true, 00:18:30.956 "num_base_bdevs": 3, 00:18:30.956 "num_base_bdevs_discovered": 1, 00:18:30.956 "num_base_bdevs_operational": 2, 00:18:30.956 "base_bdevs_list": [ 00:18:30.956 { 00:18:30.956 "name": null, 00:18:30.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.956 "is_configured": false, 00:18:30.956 "data_offset": 2048, 00:18:30.956 "data_size": 63488 00:18:30.956 }, 00:18:30.956 { 00:18:30.956 "name": "pt2", 00:18:30.956 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:30.956 "is_configured": true, 00:18:30.956 "data_offset": 2048, 00:18:30.956 "data_size": 63488 00:18:30.956 }, 00:18:30.956 { 00:18:30.956 "name": null, 00:18:30.956 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:30.956 "is_configured": false, 00:18:30.956 "data_offset": 2048, 00:18:30.956 "data_size": 63488 00:18:30.956 } 00:18:30.956 ] 00:18:30.956 }' 00:18:30.956 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.956 07:24:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.523 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:31.523 07:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:31.782 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:18:31.782 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:32.041 [2024-07-25 
07:24:04.550679] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:32.041 [2024-07-25 07:24:04.550730] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:32.041 [2024-07-25 07:24:04.550750] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd4d720 00:18:32.041 [2024-07-25 07:24:04.550762] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:32.041 [2024-07-25 07:24:04.551084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:32.041 [2024-07-25 07:24:04.551100] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:32.041 [2024-07-25 07:24:04.551167] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:32.041 [2024-07-25 07:24:04.551187] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:32.041 [2024-07-25 07:24:04.551283] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xefd4f0 00:18:32.041 [2024-07-25 07:24:04.551293] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:32.041 [2024-07-25 07:24:04.551462] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xefbad0 00:18:32.041 [2024-07-25 07:24:04.551580] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xefd4f0 00:18:32.041 [2024-07-25 07:24:04.551589] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xefd4f0 00:18:32.041 [2024-07-25 07:24:04.551677] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.041 pt3 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.300 "name": "raid_bdev1", 00:18:32.300 "uuid": "c899d34e-bfc2-489b-b477-6f65068bd2c0", 00:18:32.300 "strip_size_kb": 0, 00:18:32.300 "state": "online", 00:18:32.300 "raid_level": "raid1", 00:18:32.300 "superblock": true, 00:18:32.300 "num_base_bdevs": 3, 00:18:32.300 "num_base_bdevs_discovered": 2, 
00:18:32.300 "num_base_bdevs_operational": 2, 00:18:32.300 "base_bdevs_list": [ 00:18:32.300 { 00:18:32.300 "name": null, 00:18:32.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.300 "is_configured": false, 00:18:32.300 "data_offset": 2048, 00:18:32.300 "data_size": 63488 00:18:32.300 }, 00:18:32.300 { 00:18:32.300 "name": "pt2", 00:18:32.300 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:32.300 "is_configured": true, 00:18:32.300 "data_offset": 2048, 00:18:32.300 "data_size": 63488 00:18:32.300 }, 00:18:32.300 { 00:18:32.300 "name": "pt3", 00:18:32.300 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:32.300 "is_configured": true, 00:18:32.300 "data_offset": 2048, 00:18:32.300 "data_size": 63488 00:18:32.300 } 00:18:32.300 ] 00:18:32.300 }' 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.300 07:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.868 07:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:32.868 07:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:33.126 07:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:18:33.126 07:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:33.126 07:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:18:33.386 [2024-07-25 07:24:05.826263] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' c899d34e-bfc2-489b-b477-6f65068bd2c0 '!=' c899d34e-bfc2-489b-b477-6f65068bd2c0 ']' 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1653517 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1653517 ']' 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1653517 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1653517 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1653517' 00:18:33.386 killing process with pid 1653517 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1653517 00:18:33.386 [2024-07-25 07:24:05.904714] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:33.386 [2024-07-25 07:24:05.904764] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:33.386 [2024-07-25 07:24:05.904817] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
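The closing assertions of raid_superblock_test, visible in the trace just above, come down to two checks: the slot that used to hold pt1 must still be unconfigured even though the array is back online, and the UUID read back from the assembled raid_bdev1 must match the one recorded when the array was first created. A sketch of that intent only (the script's own helper performs the UUID comparison with test '!='; the rpc shorthand is the same as above):

    # The array is online, but slot 0 (the dropped pt1) stays unconfigured.
    [ "$($rpc bdev_raid_get_bdevs online | jq -r '.[].base_bdevs_list[0].is_configured')" = "false" ]
    # The superblock UUID survived every teardown and re-assembly.
    uuid=$($rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')
    [ "$uuid" = "c899d34e-bfc2-489b-b477-6f65068bd2c0" ]
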
00:18:33.386 [2024-07-25 07:24:05.904828] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefd4f0 name raid_bdev1, state offline 00:18:33.386 07:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1653517 00:18:33.645 [2024-07-25 07:24:05.928804] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:33.645 07:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:18:33.645 00:18:33.645 real 0m21.241s 00:18:33.646 user 0m38.736s 00:18:33.646 sys 0m3.843s 00:18:33.646 07:24:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:33.646 07:24:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.646 ************************************ 00:18:33.646 END TEST raid_superblock_test 00:18:33.646 ************************************ 00:18:33.646 07:24:06 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:18:33.646 07:24:06 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:33.646 07:24:06 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:33.646 07:24:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:33.905 ************************************ 00:18:33.905 START TEST raid_read_error_test 00:18:33.905 ************************************ 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 read 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:33.905 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.QB7H5tRwfp 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1657989 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1657989 /var/tmp/spdk-raid.sock 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1657989 ']' 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:33.906 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:33.906 07:24:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.906 [2024-07-25 07:24:06.277408] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
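raid_read_error_test exercises the array with real I/O instead of only inspecting RPC output: bdevperf is started with -z so it idles until the bdevs are configured over /var/tmp/spdk-raid.sock, its output is captured in the mktemp log above, and the rest of the trace then creates the malloc/error/passthru base bdevs, builds the raid1 bdev, injects a read error on the first base bdev, and checks that the failure rate reported for raid_bdev1 stays at zero. A condensed sketch of that flow, with the paths and flags copied from the trace (the spdk and rpc variables, and the exact output redirection into the log file, are shorthand assumptions rather than the script's literal wording):

    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
    bdevperf_log=/raidtest/tmp.QB7H5tRwfp
    rpc="$spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Start bdevperf idle (-z) so the bdevs can first be created over RPC; 60 s randrw workload.
    $spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" &
    # ... malloc + error + passthru base bdevs and the raid1 bdev are created over RPC ...
    # Inject read failures on the first base bdev's error wrapper, then run the workload.
    $rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure
    $spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    # raid1 should absorb the injected errors: the per-bdev failure rate must read 0.00.
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    [ "$fail_per_s" = "0.00" ]
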
00:18:33.906 [2024-07-25 07:24:06.277461] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657989 ] 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.4 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:33.906 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:33.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:33.906 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:33.906 [2024-07-25 07:24:06.409184] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:34.165 [2024-07-25 07:24:06.497299] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:18:34.166 [2024-07-25 07:24:06.557738] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:34.166 [2024-07-25 07:24:06.557781] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:34.734 07:24:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:34.734 07:24:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:34.734 07:24:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:34.734 07:24:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:34.993 BaseBdev1_malloc 00:18:34.993 07:24:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:35.253 true 00:18:35.253 07:24:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:35.512 [2024-07-25 07:24:07.867581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:35.512 [2024-07-25 07:24:07.867622] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.512 [2024-07-25 07:24:07.867639] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cbfa50 00:18:35.512 [2024-07-25 07:24:07.867651] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.512 [2024-07-25 07:24:07.869063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.512 [2024-07-25 07:24:07.869090] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:35.512 BaseBdev1 00:18:35.512 07:24:07 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:35.512 07:24:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:35.771 BaseBdev2_malloc 00:18:35.771 07:24:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:36.049 true 00:18:36.049 07:24:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:36.049 [2024-07-25 07:24:08.541803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:36.049 [2024-07-25 07:24:08.541840] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.049 [2024-07-25 07:24:08.541858] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e68f40 00:18:36.049 [2024-07-25 07:24:08.541870] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.049 [2024-07-25 07:24:08.543193] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.049 [2024-07-25 07:24:08.543220] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:36.049 BaseBdev2 00:18:36.049 07:24:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:36.049 07:24:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:36.349 BaseBdev3_malloc 00:18:36.349 07:24:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:36.608 true 00:18:36.608 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:36.866 [2024-07-25 07:24:09.251935] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:36.866 [2024-07-25 07:24:09.251974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.866 [2024-07-25 07:24:09.251992] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e6c250 00:18:36.866 [2024-07-25 07:24:09.252003] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.866 [2024-07-25 07:24:09.253328] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.866 [2024-07-25 07:24:09.253354] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:36.866 BaseBdev3 00:18:36.867 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:37.125 [2024-07-25 07:24:09.492587] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:37.125 [2024-07-25 07:24:09.493800] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:37.125 [2024-07-25 07:24:09.493864] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:37.125 [2024-07-25 07:24:09.494064] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e6d010 00:18:37.125 [2024-07-25 07:24:09.494075] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:37.125 [2024-07-25 07:24:09.494260] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cbbc70 00:18:37.125 [2024-07-25 07:24:09.494405] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e6d010 00:18:37.125 [2024-07-25 07:24:09.494414] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e6d010 00:18:37.125 [2024-07-25 07:24:09.494507] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.125 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.384 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.384 "name": "raid_bdev1", 00:18:37.384 "uuid": "2c9294f3-fd7e-4ffd-80f2-bd6be3030b10", 00:18:37.384 "strip_size_kb": 0, 00:18:37.384 "state": "online", 00:18:37.384 "raid_level": "raid1", 00:18:37.384 "superblock": true, 00:18:37.384 "num_base_bdevs": 3, 00:18:37.384 "num_base_bdevs_discovered": 3, 00:18:37.384 "num_base_bdevs_operational": 3, 00:18:37.384 "base_bdevs_list": [ 00:18:37.384 { 00:18:37.384 "name": "BaseBdev1", 00:18:37.384 "uuid": "3800ccf2-a2fa-571d-a7c6-8bacb14b7b4e", 00:18:37.384 "is_configured": true, 00:18:37.384 "data_offset": 2048, 00:18:37.384 "data_size": 63488 00:18:37.384 }, 00:18:37.384 { 00:18:37.384 "name": "BaseBdev2", 00:18:37.384 "uuid": "c3e07b81-3ff5-5e06-9add-7bc7393ac319", 00:18:37.384 "is_configured": true, 00:18:37.384 "data_offset": 2048, 00:18:37.384 "data_size": 63488 00:18:37.384 }, 00:18:37.384 { 00:18:37.384 "name": "BaseBdev3", 00:18:37.384 "uuid": "9c397847-d167-5366-9745-7eb192008762", 00:18:37.384 "is_configured": true, 00:18:37.384 "data_offset": 2048, 00:18:37.384 "data_size": 63488 
00:18:37.384 } 00:18:37.384 ] 00:18:37.384 }' 00:18:37.384 07:24:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.384 07:24:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.952 07:24:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:18:37.952 07:24:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:37.952 [2024-07-25 07:24:10.415359] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e6de00 00:18:38.889 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.149 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:39.409 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.409 "name": "raid_bdev1", 00:18:39.409 "uuid": "2c9294f3-fd7e-4ffd-80f2-bd6be3030b10", 00:18:39.409 "strip_size_kb": 0, 00:18:39.409 "state": "online", 00:18:39.409 "raid_level": "raid1", 00:18:39.409 "superblock": true, 00:18:39.409 "num_base_bdevs": 3, 00:18:39.409 "num_base_bdevs_discovered": 3, 00:18:39.409 "num_base_bdevs_operational": 3, 00:18:39.409 "base_bdevs_list": [ 00:18:39.409 { 00:18:39.409 "name": "BaseBdev1", 00:18:39.409 "uuid": "3800ccf2-a2fa-571d-a7c6-8bacb14b7b4e", 00:18:39.409 "is_configured": true, 00:18:39.409 "data_offset": 2048, 00:18:39.409 "data_size": 63488 00:18:39.409 }, 00:18:39.409 { 00:18:39.409 "name": "BaseBdev2", 00:18:39.409 "uuid": 
"c3e07b81-3ff5-5e06-9add-7bc7393ac319", 00:18:39.409 "is_configured": true, 00:18:39.409 "data_offset": 2048, 00:18:39.409 "data_size": 63488 00:18:39.409 }, 00:18:39.409 { 00:18:39.409 "name": "BaseBdev3", 00:18:39.409 "uuid": "9c397847-d167-5366-9745-7eb192008762", 00:18:39.409 "is_configured": true, 00:18:39.409 "data_offset": 2048, 00:18:39.409 "data_size": 63488 00:18:39.409 } 00:18:39.409 ] 00:18:39.409 }' 00:18:39.409 07:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.409 07:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:39.979 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:40.238 [2024-07-25 07:24:12.583364] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:40.238 [2024-07-25 07:24:12.583399] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:40.238 [2024-07-25 07:24:12.586438] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:40.238 [2024-07-25 07:24:12.586470] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.238 [2024-07-25 07:24:12.586557] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:40.238 [2024-07-25 07:24:12.586568] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e6d010 name raid_bdev1, state offline 00:18:40.238 0 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1657989 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1657989 ']' 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1657989 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1657989 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1657989' 00:18:40.238 killing process with pid 1657989 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1657989 00:18:40.238 [2024-07-25 07:24:12.661965] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:40.238 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1657989 00:18:40.238 [2024-07-25 07:24:12.680928] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:40.497 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.QB7H5tRwfp 00:18:40.497 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:18:40.497 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:18:40.497 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:18:40.497 07:24:12 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:18:40.497 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:40.498 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:40.498 07:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:40.498 00:18:40.498 real 0m6.687s 00:18:40.498 user 0m10.591s 00:18:40.498 sys 0m1.142s 00:18:40.498 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:40.498 07:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.498 ************************************ 00:18:40.498 END TEST raid_read_error_test 00:18:40.498 ************************************ 00:18:40.498 07:24:12 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:18:40.498 07:24:12 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:40.498 07:24:12 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:40.498 07:24:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:40.498 ************************************ 00:18:40.498 START TEST raid_write_error_test 00:18:40.498 ************************************ 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 write 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 
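For reference, the raid_write_error_test run that begins above drives bdevperf against a malloc -> error -> passthru -> raid1 stack and then injects write failures on one base bdev. A minimal stand-alone sketch of that RPC sequence follows; it assumes bdevperf is already running with "-r /var/tmp/spdk-raid.sock -z" as launched in this log, uses only RPC commands that appear in the log, and is an illustration rather than the test script itself.

#!/usr/bin/env bash
# Sketch only: rebuild the raid1 error-injection stack by hand.
# Assumes bdevperf is already listening on /var/tmp/spdk-raid.sock (see the log above).
set -e
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# One malloc -> error -> passthru chain per base bdev, as in the log.
for i in 1 2 3; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_error_create BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done

# Assemble the raid1 bdev with a superblock (-s) and start bdevperf I/O in the background.
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
$SPDK/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &

# Inject write failures on the first base bdev, inspect the raid state, then tear down.
$RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
$RPC bdev_raid_delete raid_bdev1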
00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.vzrHX4Toog 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1659366 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1659366 /var/tmp/spdk-raid.sock 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1659366 ']' 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:40.498 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:40.498 07:24:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.758 [2024-07-25 07:24:13.048354] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:18:40.758 [2024-07-25 07:24:13.048411] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659366 ] 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3f:01.4 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:40.758 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.758 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:40.758 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:40.759 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:40.759 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:40.759 [2024-07-25 07:24:13.180652] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:40.759 [2024-07-25 07:24:13.271414] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:18:41.024 [2024-07-25 07:24:13.327628] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:41.024 [2024-07-25 07:24:13.327655] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:41.594 07:24:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:41.594 07:24:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:41.594 07:24:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:41.594 07:24:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:41.853 BaseBdev1_malloc 00:18:41.853 07:24:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:42.112 true 00:18:42.112 07:24:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:42.112 [2024-07-25 07:24:14.608182] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:42.112 [2024-07-25 07:24:14.608233] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.112 [2024-07-25 07:24:14.608259] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b2a50 00:18:42.112 [2024-07-25 07:24:14.608271] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.112 [2024-07-25 07:24:14.609788] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.112 [2024-07-25 07:24:14.609815] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:42.112 BaseBdev1 00:18:42.112 07:24:14 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:42.112 07:24:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:42.372 BaseBdev2_malloc 00:18:42.372 07:24:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:42.631 true 00:18:42.631 07:24:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:42.890 [2024-07-25 07:24:15.314344] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:42.890 [2024-07-25 07:24:15.314382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.890 [2024-07-25 07:24:15.314399] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x195bf40 00:18:42.890 [2024-07-25 07:24:15.314411] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.890 [2024-07-25 07:24:15.315722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.890 [2024-07-25 07:24:15.315748] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:42.890 BaseBdev2 00:18:42.890 07:24:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:42.890 07:24:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:43.149 BaseBdev3_malloc 00:18:43.149 07:24:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:43.409 true 00:18:43.409 07:24:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:43.668 [2024-07-25 07:24:16.024408] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:43.668 [2024-07-25 07:24:16.024449] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:43.668 [2024-07-25 07:24:16.024465] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x195f250 00:18:43.668 [2024-07-25 07:24:16.024477] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:43.668 [2024-07-25 07:24:16.025856] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:43.668 [2024-07-25 07:24:16.025884] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:43.668 BaseBdev3 00:18:43.668 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:43.927 [2024-07-25 07:24:16.245009] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:43.927 [2024-07-25 07:24:16.246189] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:43.927 [2024-07-25 07:24:16.246254] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:43.927 [2024-07-25 07:24:16.246451] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1960010 00:18:43.927 [2024-07-25 07:24:16.246462] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:43.927 [2024-07-25 07:24:16.246634] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17aec70 00:18:43.927 [2024-07-25 07:24:16.246773] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1960010 00:18:43.927 [2024-07-25 07:24:16.246782] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1960010 00:18:43.927 [2024-07-25 07:24:16.246874] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.927 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.187 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.187 "name": "raid_bdev1", 00:18:44.187 "uuid": "c8ed5966-1609-4899-bf4d-d1fa25e0192e", 00:18:44.187 "strip_size_kb": 0, 00:18:44.187 "state": "online", 00:18:44.187 "raid_level": "raid1", 00:18:44.187 "superblock": true, 00:18:44.187 "num_base_bdevs": 3, 00:18:44.187 "num_base_bdevs_discovered": 3, 00:18:44.187 "num_base_bdevs_operational": 3, 00:18:44.187 "base_bdevs_list": [ 00:18:44.187 { 00:18:44.187 "name": "BaseBdev1", 00:18:44.187 "uuid": "5d8d1c90-962b-5cfb-8fd8-e02b2993515e", 00:18:44.187 "is_configured": true, 00:18:44.187 "data_offset": 2048, 00:18:44.187 "data_size": 63488 00:18:44.187 }, 00:18:44.187 { 00:18:44.187 "name": "BaseBdev2", 00:18:44.187 "uuid": "9b7279f4-f67c-5165-9dc4-471c504f2e0e", 00:18:44.187 "is_configured": true, 00:18:44.187 "data_offset": 2048, 00:18:44.187 "data_size": 63488 00:18:44.187 }, 00:18:44.187 { 00:18:44.187 "name": "BaseBdev3", 00:18:44.187 "uuid": "7b48e139-21e0-594d-8e12-9b718dfa6710", 00:18:44.187 "is_configured": true, 00:18:44.187 "data_offset": 2048, 00:18:44.187 "data_size": 
63488 00:18:44.187 } 00:18:44.187 ] 00:18:44.187 }' 00:18:44.187 07:24:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.187 07:24:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.755 07:24:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:18:44.755 07:24:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:44.755 [2024-07-25 07:24:17.139612] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1960e00 00:18:45.693 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:45.953 [2024-07-25 07:24:18.249851] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:45.953 [2024-07-25 07:24:18.249904] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:45.953 [2024-07-25 07:24:18.250104] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1960e00 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=2 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.953 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:46.212 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.212 "name": "raid_bdev1", 00:18:46.212 "uuid": "c8ed5966-1609-4899-bf4d-d1fa25e0192e", 00:18:46.212 "strip_size_kb": 0, 00:18:46.212 "state": "online", 00:18:46.212 "raid_level": "raid1", 00:18:46.212 "superblock": true, 00:18:46.212 "num_base_bdevs": 3, 
00:18:46.213 "num_base_bdevs_discovered": 2, 00:18:46.213 "num_base_bdevs_operational": 2, 00:18:46.213 "base_bdevs_list": [ 00:18:46.213 { 00:18:46.213 "name": null, 00:18:46.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.213 "is_configured": false, 00:18:46.213 "data_offset": 2048, 00:18:46.213 "data_size": 63488 00:18:46.213 }, 00:18:46.213 { 00:18:46.213 "name": "BaseBdev2", 00:18:46.213 "uuid": "9b7279f4-f67c-5165-9dc4-471c504f2e0e", 00:18:46.213 "is_configured": true, 00:18:46.213 "data_offset": 2048, 00:18:46.213 "data_size": 63488 00:18:46.213 }, 00:18:46.213 { 00:18:46.213 "name": "BaseBdev3", 00:18:46.213 "uuid": "7b48e139-21e0-594d-8e12-9b718dfa6710", 00:18:46.213 "is_configured": true, 00:18:46.213 "data_offset": 2048, 00:18:46.213 "data_size": 63488 00:18:46.213 } 00:18:46.213 ] 00:18:46.213 }' 00:18:46.213 07:24:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.213 07:24:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.782 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:46.782 [2024-07-25 07:24:19.255330] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:46.782 [2024-07-25 07:24:19.255363] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:46.782 [2024-07-25 07:24:19.258251] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:46.782 [2024-07-25 07:24:19.258280] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:46.782 [2024-07-25 07:24:19.258346] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:46.782 [2024-07-25 07:24:19.258357] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1960010 name raid_bdev1, state offline 00:18:46.782 0 00:18:46.782 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1659366 00:18:46.782 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1659366 ']' 00:18:46.782 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1659366 00:18:46.782 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:18:46.782 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:46.782 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1659366 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1659366' 00:18:47.041 killing process with pid 1659366 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1659366 00:18:47.041 [2024-07-25 07:24:19.330994] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1659366 00:18:47.041 [2024-07-25 07:24:19.349690] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:47.041 07:24:19 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.vzrHX4Toog 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:47.041 00:18:47.041 real 0m6.580s 00:18:47.041 user 0m10.346s 00:18:47.041 sys 0m1.177s 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:47.041 07:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.041 ************************************ 00:18:47.041 END TEST raid_write_error_test 00:18:47.041 ************************************ 00:18:47.301 07:24:19 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:18:47.301 07:24:19 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:18:47.301 07:24:19 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:18:47.301 07:24:19 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:47.301 07:24:19 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:47.301 07:24:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:47.301 ************************************ 00:18:47.301 START TEST raid_state_function_test 00:18:47.301 ************************************ 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 false 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:47.301 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1660520 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1660520' 00:18:47.302 Process raid pid: 1660520 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1660520 /var/tmp/spdk-raid.sock 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1660520 ']' 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:47.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:47.302 07:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.302 [2024-07-25 07:24:19.682600] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
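The raid_state_function_test run starting here creates a raid0 set from base bdevs that do not exist yet and then adds them one at a time, checking after each step that the raid bdev stays in the "configuring" state. A rough stand-alone equivalent of that check is sketched below; the check_raid_state helper is hypothetical and only the rpc.py invocation and the jq filter are taken from the log.

#!/usr/bin/env bash
# Sketch only: query the raid state the way the test's verify step does.
# Assumes the bdev_svc app launched above is listening on /var/tmp/spdk-raid.sock.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# check_raid_state <raid_name> <expected_state>  (hypothetical helper, not part of bdev_raid.sh)
check_raid_state() {
    local name=$1 expected=$2
    local info state
    info=$($RPC bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
    state=$(echo "$info" | jq -r '.state')
    if [ "$state" != "$expected" ]; then
        echo "expected state $expected, got $state" >&2
        return 1
    fi
    echo "$info" | jq -r '.num_base_bdevs_discovered'
}

# While only some of BaseBdev1..BaseBdev4 exist, the raid0 set stays "configuring".
check_raid_state Existed_Raid configuring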
00:18:47.302 [2024-07-25 07:24:19.682654] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.4 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:47.302 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:47.302 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:47.302 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:47.302 [2024-07-25 07:24:19.815165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:47.561 [2024-07-25 07:24:19.902240] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:18:47.561 [2024-07-25 07:24:19.961149] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:47.561 [2024-07-25 07:24:19.961185] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:48.131 07:24:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:48.131 07:24:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:18:48.131 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:48.390 [2024-07-25 07:24:20.804824] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:48.390 [2024-07-25 07:24:20.804863] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:48.390 [2024-07-25 07:24:20.804873] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:48.390 [2024-07-25 07:24:20.804884] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:48.390 [2024-07-25 07:24:20.804896] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:48.390 [2024-07-25 07:24:20.804906] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:48.390 [2024-07-25 07:24:20.804914] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:48.390 [2024-07-25 07:24:20.804924] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.390 07:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.649 07:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.649 "name": "Existed_Raid", 00:18:48.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.649 "strip_size_kb": 64, 00:18:48.649 "state": "configuring", 00:18:48.649 "raid_level": "raid0", 00:18:48.649 "superblock": false, 00:18:48.649 "num_base_bdevs": 4, 00:18:48.649 "num_base_bdevs_discovered": 0, 00:18:48.649 "num_base_bdevs_operational": 4, 00:18:48.649 "base_bdevs_list": [ 00:18:48.649 { 00:18:48.649 "name": "BaseBdev1", 00:18:48.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.649 "is_configured": false, 00:18:48.649 "data_offset": 0, 00:18:48.649 "data_size": 0 00:18:48.649 }, 00:18:48.649 { 00:18:48.649 "name": "BaseBdev2", 00:18:48.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.649 "is_configured": false, 00:18:48.649 "data_offset": 0, 00:18:48.649 "data_size": 0 00:18:48.649 }, 00:18:48.649 { 00:18:48.649 "name": "BaseBdev3", 00:18:48.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.649 "is_configured": false, 00:18:48.649 "data_offset": 0, 00:18:48.649 "data_size": 0 00:18:48.649 }, 00:18:48.649 { 00:18:48.649 "name": "BaseBdev4", 00:18:48.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.649 "is_configured": false, 00:18:48.649 "data_offset": 0, 00:18:48.649 "data_size": 0 00:18:48.649 } 00:18:48.649 ] 00:18:48.649 }' 00:18:48.649 07:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.649 07:24:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.246 07:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:49.246 [2024-07-25 07:24:21.771218] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:49.246 [2024-07-25 07:24:21.771243] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8dcee0 name Existed_Raid, state configuring 00:18:49.567 07:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:49.567 
[2024-07-25 07:24:21.987806] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:49.567 [2024-07-25 07:24:21.987833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:49.567 [2024-07-25 07:24:21.987843] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:49.567 [2024-07-25 07:24:21.987860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:49.567 [2024-07-25 07:24:21.987869] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:49.567 [2024-07-25 07:24:21.987880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:49.567 [2024-07-25 07:24:21.987889] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:49.567 [2024-07-25 07:24:21.987899] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:49.567 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:49.827 [2024-07-25 07:24:22.225903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:49.827 BaseBdev1 00:18:49.827 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:49.827 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:18:49.827 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:49.827 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:49.827 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:49.827 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:49.827 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:50.086 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:50.346 [ 00:18:50.346 { 00:18:50.346 "name": "BaseBdev1", 00:18:50.346 "aliases": [ 00:18:50.346 "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf" 00:18:50.346 ], 00:18:50.346 "product_name": "Malloc disk", 00:18:50.346 "block_size": 512, 00:18:50.346 "num_blocks": 65536, 00:18:50.346 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:50.346 "assigned_rate_limits": { 00:18:50.346 "rw_ios_per_sec": 0, 00:18:50.346 "rw_mbytes_per_sec": 0, 00:18:50.346 "r_mbytes_per_sec": 0, 00:18:50.346 "w_mbytes_per_sec": 0 00:18:50.346 }, 00:18:50.346 "claimed": true, 00:18:50.346 "claim_type": "exclusive_write", 00:18:50.346 "zoned": false, 00:18:50.346 "supported_io_types": { 00:18:50.346 "read": true, 00:18:50.346 "write": true, 00:18:50.346 "unmap": true, 00:18:50.346 "flush": true, 00:18:50.346 "reset": true, 00:18:50.346 "nvme_admin": false, 00:18:50.346 "nvme_io": false, 00:18:50.346 "nvme_io_md": false, 00:18:50.346 "write_zeroes": true, 00:18:50.346 "zcopy": true, 00:18:50.346 "get_zone_info": false, 00:18:50.346 "zone_management": false, 00:18:50.346 
"zone_append": false, 00:18:50.346 "compare": false, 00:18:50.346 "compare_and_write": false, 00:18:50.346 "abort": true, 00:18:50.346 "seek_hole": false, 00:18:50.346 "seek_data": false, 00:18:50.346 "copy": true, 00:18:50.346 "nvme_iov_md": false 00:18:50.346 }, 00:18:50.346 "memory_domains": [ 00:18:50.346 { 00:18:50.346 "dma_device_id": "system", 00:18:50.346 "dma_device_type": 1 00:18:50.346 }, 00:18:50.346 { 00:18:50.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.346 "dma_device_type": 2 00:18:50.346 } 00:18:50.346 ], 00:18:50.346 "driver_specific": {} 00:18:50.346 } 00:18:50.346 ] 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.346 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.605 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.605 "name": "Existed_Raid", 00:18:50.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.605 "strip_size_kb": 64, 00:18:50.605 "state": "configuring", 00:18:50.605 "raid_level": "raid0", 00:18:50.606 "superblock": false, 00:18:50.606 "num_base_bdevs": 4, 00:18:50.606 "num_base_bdevs_discovered": 1, 00:18:50.606 "num_base_bdevs_operational": 4, 00:18:50.606 "base_bdevs_list": [ 00:18:50.606 { 00:18:50.606 "name": "BaseBdev1", 00:18:50.606 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:50.606 "is_configured": true, 00:18:50.606 "data_offset": 0, 00:18:50.606 "data_size": 65536 00:18:50.606 }, 00:18:50.606 { 00:18:50.606 "name": "BaseBdev2", 00:18:50.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.606 "is_configured": false, 00:18:50.606 "data_offset": 0, 00:18:50.606 "data_size": 0 00:18:50.606 }, 00:18:50.606 { 00:18:50.606 "name": "BaseBdev3", 00:18:50.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.606 "is_configured": false, 00:18:50.606 "data_offset": 0, 00:18:50.606 "data_size": 0 00:18:50.606 }, 00:18:50.606 { 00:18:50.606 "name": "BaseBdev4", 00:18:50.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.606 "is_configured": false, 00:18:50.606 "data_offset": 0, 
00:18:50.606 "data_size": 0 00:18:50.606 } 00:18:50.606 ] 00:18:50.606 }' 00:18:50.606 07:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.606 07:24:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.173 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:51.173 [2024-07-25 07:24:23.697787] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:51.173 [2024-07-25 07:24:23.697824] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8dc750 name Existed_Raid, state configuring 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:51.432 [2024-07-25 07:24:23.926426] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:51.432 [2024-07-25 07:24:23.927808] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:51.432 [2024-07-25 07:24:23.927839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:51.432 [2024-07-25 07:24:23.927849] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:51.432 [2024-07-25 07:24:23.927859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:51.432 [2024-07-25 07:24:23.927867] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:51.432 [2024-07-25 07:24:23.927878] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.432 07:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.432 07:24:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:51.691 07:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.691 "name": "Existed_Raid", 00:18:51.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.691 "strip_size_kb": 64, 00:18:51.691 "state": "configuring", 00:18:51.691 "raid_level": "raid0", 00:18:51.691 "superblock": false, 00:18:51.691 "num_base_bdevs": 4, 00:18:51.691 "num_base_bdevs_discovered": 1, 00:18:51.691 "num_base_bdevs_operational": 4, 00:18:51.691 "base_bdevs_list": [ 00:18:51.691 { 00:18:51.691 "name": "BaseBdev1", 00:18:51.691 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:51.691 "is_configured": true, 00:18:51.691 "data_offset": 0, 00:18:51.691 "data_size": 65536 00:18:51.691 }, 00:18:51.691 { 00:18:51.691 "name": "BaseBdev2", 00:18:51.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.691 "is_configured": false, 00:18:51.691 "data_offset": 0, 00:18:51.691 "data_size": 0 00:18:51.691 }, 00:18:51.691 { 00:18:51.691 "name": "BaseBdev3", 00:18:51.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.692 "is_configured": false, 00:18:51.692 "data_offset": 0, 00:18:51.692 "data_size": 0 00:18:51.692 }, 00:18:51.692 { 00:18:51.692 "name": "BaseBdev4", 00:18:51.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.692 "is_configured": false, 00:18:51.692 "data_offset": 0, 00:18:51.692 "data_size": 0 00:18:51.692 } 00:18:51.692 ] 00:18:51.692 }' 00:18:51.692 07:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.692 07:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.260 07:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:52.519 [2024-07-25 07:24:24.956258] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:52.519 BaseBdev2 00:18:52.519 07:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:52.519 07:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:18:52.519 07:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:52.519 07:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:52.519 07:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:52.519 07:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:52.519 07:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:52.778 07:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:53.038 [ 00:18:53.038 { 00:18:53.038 "name": "BaseBdev2", 00:18:53.038 "aliases": [ 00:18:53.038 "7dcd3540-665b-4dc4-8e33-d335c46f7b9b" 00:18:53.038 ], 00:18:53.038 "product_name": "Malloc disk", 00:18:53.038 "block_size": 512, 00:18:53.038 "num_blocks": 65536, 00:18:53.038 "uuid": "7dcd3540-665b-4dc4-8e33-d335c46f7b9b", 00:18:53.038 
"assigned_rate_limits": { 00:18:53.038 "rw_ios_per_sec": 0, 00:18:53.038 "rw_mbytes_per_sec": 0, 00:18:53.038 "r_mbytes_per_sec": 0, 00:18:53.038 "w_mbytes_per_sec": 0 00:18:53.038 }, 00:18:53.038 "claimed": true, 00:18:53.038 "claim_type": "exclusive_write", 00:18:53.038 "zoned": false, 00:18:53.038 "supported_io_types": { 00:18:53.038 "read": true, 00:18:53.038 "write": true, 00:18:53.038 "unmap": true, 00:18:53.038 "flush": true, 00:18:53.038 "reset": true, 00:18:53.038 "nvme_admin": false, 00:18:53.038 "nvme_io": false, 00:18:53.038 "nvme_io_md": false, 00:18:53.038 "write_zeroes": true, 00:18:53.038 "zcopy": true, 00:18:53.038 "get_zone_info": false, 00:18:53.038 "zone_management": false, 00:18:53.038 "zone_append": false, 00:18:53.038 "compare": false, 00:18:53.038 "compare_and_write": false, 00:18:53.038 "abort": true, 00:18:53.038 "seek_hole": false, 00:18:53.038 "seek_data": false, 00:18:53.038 "copy": true, 00:18:53.038 "nvme_iov_md": false 00:18:53.038 }, 00:18:53.038 "memory_domains": [ 00:18:53.038 { 00:18:53.038 "dma_device_id": "system", 00:18:53.038 "dma_device_type": 1 00:18:53.038 }, 00:18:53.038 { 00:18:53.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.038 "dma_device_type": 2 00:18:53.038 } 00:18:53.038 ], 00:18:53.038 "driver_specific": {} 00:18:53.038 } 00:18:53.038 ] 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.038 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:53.297 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.297 "name": "Existed_Raid", 00:18:53.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.297 "strip_size_kb": 64, 00:18:53.297 "state": "configuring", 00:18:53.297 "raid_level": "raid0", 00:18:53.297 "superblock": false, 00:18:53.297 "num_base_bdevs": 4, 00:18:53.297 "num_base_bdevs_discovered": 2, 
00:18:53.297 "num_base_bdevs_operational": 4, 00:18:53.297 "base_bdevs_list": [ 00:18:53.297 { 00:18:53.297 "name": "BaseBdev1", 00:18:53.297 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:53.297 "is_configured": true, 00:18:53.297 "data_offset": 0, 00:18:53.297 "data_size": 65536 00:18:53.297 }, 00:18:53.297 { 00:18:53.297 "name": "BaseBdev2", 00:18:53.297 "uuid": "7dcd3540-665b-4dc4-8e33-d335c46f7b9b", 00:18:53.297 "is_configured": true, 00:18:53.297 "data_offset": 0, 00:18:53.297 "data_size": 65536 00:18:53.297 }, 00:18:53.297 { 00:18:53.297 "name": "BaseBdev3", 00:18:53.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.297 "is_configured": false, 00:18:53.297 "data_offset": 0, 00:18:53.297 "data_size": 0 00:18:53.297 }, 00:18:53.297 { 00:18:53.297 "name": "BaseBdev4", 00:18:53.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.297 "is_configured": false, 00:18:53.297 "data_offset": 0, 00:18:53.297 "data_size": 0 00:18:53.297 } 00:18:53.297 ] 00:18:53.297 }' 00:18:53.297 07:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.297 07:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.865 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:54.124 [2024-07-25 07:24:26.439574] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:54.124 BaseBdev3 00:18:54.124 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:54.124 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:18:54.124 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:54.124 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:54.124 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:54.124 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:54.125 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:54.384 [ 00:18:54.384 { 00:18:54.384 "name": "BaseBdev3", 00:18:54.384 "aliases": [ 00:18:54.384 "e4c6988e-897a-40aa-8e55-1371b51842ee" 00:18:54.384 ], 00:18:54.384 "product_name": "Malloc disk", 00:18:54.384 "block_size": 512, 00:18:54.384 "num_blocks": 65536, 00:18:54.384 "uuid": "e4c6988e-897a-40aa-8e55-1371b51842ee", 00:18:54.384 "assigned_rate_limits": { 00:18:54.384 "rw_ios_per_sec": 0, 00:18:54.384 "rw_mbytes_per_sec": 0, 00:18:54.384 "r_mbytes_per_sec": 0, 00:18:54.384 "w_mbytes_per_sec": 0 00:18:54.384 }, 00:18:54.384 "claimed": true, 00:18:54.384 "claim_type": "exclusive_write", 00:18:54.384 "zoned": false, 00:18:54.384 "supported_io_types": { 00:18:54.384 "read": true, 00:18:54.384 "write": true, 00:18:54.384 "unmap": true, 00:18:54.384 "flush": true, 00:18:54.384 "reset": true, 00:18:54.384 "nvme_admin": false, 00:18:54.384 "nvme_io": false, 00:18:54.384 
"nvme_io_md": false, 00:18:54.384 "write_zeroes": true, 00:18:54.384 "zcopy": true, 00:18:54.384 "get_zone_info": false, 00:18:54.384 "zone_management": false, 00:18:54.384 "zone_append": false, 00:18:54.384 "compare": false, 00:18:54.384 "compare_and_write": false, 00:18:54.384 "abort": true, 00:18:54.384 "seek_hole": false, 00:18:54.384 "seek_data": false, 00:18:54.384 "copy": true, 00:18:54.384 "nvme_iov_md": false 00:18:54.384 }, 00:18:54.384 "memory_domains": [ 00:18:54.384 { 00:18:54.384 "dma_device_id": "system", 00:18:54.384 "dma_device_type": 1 00:18:54.384 }, 00:18:54.384 { 00:18:54.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.384 "dma_device_type": 2 00:18:54.384 } 00:18:54.384 ], 00:18:54.384 "driver_specific": {} 00:18:54.384 } 00:18:54.384 ] 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.384 07:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.644 07:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.644 "name": "Existed_Raid", 00:18:54.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.644 "strip_size_kb": 64, 00:18:54.644 "state": "configuring", 00:18:54.644 "raid_level": "raid0", 00:18:54.644 "superblock": false, 00:18:54.644 "num_base_bdevs": 4, 00:18:54.644 "num_base_bdevs_discovered": 3, 00:18:54.644 "num_base_bdevs_operational": 4, 00:18:54.644 "base_bdevs_list": [ 00:18:54.644 { 00:18:54.644 "name": "BaseBdev1", 00:18:54.644 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:54.644 "is_configured": true, 00:18:54.644 "data_offset": 0, 00:18:54.644 "data_size": 65536 00:18:54.644 }, 00:18:54.644 { 00:18:54.644 "name": "BaseBdev2", 00:18:54.644 "uuid": "7dcd3540-665b-4dc4-8e33-d335c46f7b9b", 00:18:54.644 "is_configured": true, 00:18:54.644 "data_offset": 0, 00:18:54.644 "data_size": 65536 00:18:54.644 }, 00:18:54.644 { 
00:18:54.644 "name": "BaseBdev3", 00:18:54.644 "uuid": "e4c6988e-897a-40aa-8e55-1371b51842ee", 00:18:54.644 "is_configured": true, 00:18:54.644 "data_offset": 0, 00:18:54.644 "data_size": 65536 00:18:54.644 }, 00:18:54.644 { 00:18:54.644 "name": "BaseBdev4", 00:18:54.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.644 "is_configured": false, 00:18:54.644 "data_offset": 0, 00:18:54.644 "data_size": 0 00:18:54.644 } 00:18:54.644 ] 00:18:54.644 }' 00:18:54.644 07:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.644 07:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:55.211 07:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:55.469 [2024-07-25 07:24:27.918672] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:55.469 [2024-07-25 07:24:27.918705] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x8dd7b0 00:18:55.469 [2024-07-25 07:24:27.918713] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:55.469 [2024-07-25 07:24:27.918883] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa909d0 00:18:55.469 [2024-07-25 07:24:27.919002] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8dd7b0 00:18:55.469 [2024-07-25 07:24:27.919012] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8dd7b0 00:18:55.469 [2024-07-25 07:24:27.919165] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:55.469 BaseBdev4 00:18:55.469 07:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:55.469 07:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:18:55.469 07:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:55.469 07:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:55.469 07:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:55.469 07:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:55.469 07:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:55.728 07:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:55.987 [ 00:18:55.987 { 00:18:55.987 "name": "BaseBdev4", 00:18:55.987 "aliases": [ 00:18:55.987 "073c49b0-ce8a-4449-b73f-dbc20ccb5b84" 00:18:55.987 ], 00:18:55.987 "product_name": "Malloc disk", 00:18:55.987 "block_size": 512, 00:18:55.987 "num_blocks": 65536, 00:18:55.987 "uuid": "073c49b0-ce8a-4449-b73f-dbc20ccb5b84", 00:18:55.987 "assigned_rate_limits": { 00:18:55.987 "rw_ios_per_sec": 0, 00:18:55.987 "rw_mbytes_per_sec": 0, 00:18:55.987 "r_mbytes_per_sec": 0, 00:18:55.987 "w_mbytes_per_sec": 0 00:18:55.987 }, 00:18:55.987 "claimed": true, 00:18:55.987 "claim_type": "exclusive_write", 00:18:55.987 "zoned": false, 00:18:55.987 "supported_io_types": { 
00:18:55.987 "read": true, 00:18:55.987 "write": true, 00:18:55.987 "unmap": true, 00:18:55.987 "flush": true, 00:18:55.987 "reset": true, 00:18:55.987 "nvme_admin": false, 00:18:55.987 "nvme_io": false, 00:18:55.987 "nvme_io_md": false, 00:18:55.987 "write_zeroes": true, 00:18:55.987 "zcopy": true, 00:18:55.987 "get_zone_info": false, 00:18:55.987 "zone_management": false, 00:18:55.987 "zone_append": false, 00:18:55.987 "compare": false, 00:18:55.987 "compare_and_write": false, 00:18:55.987 "abort": true, 00:18:55.987 "seek_hole": false, 00:18:55.987 "seek_data": false, 00:18:55.987 "copy": true, 00:18:55.987 "nvme_iov_md": false 00:18:55.987 }, 00:18:55.987 "memory_domains": [ 00:18:55.987 { 00:18:55.987 "dma_device_id": "system", 00:18:55.987 "dma_device_type": 1 00:18:55.987 }, 00:18:55.987 { 00:18:55.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.987 "dma_device_type": 2 00:18:55.987 } 00:18:55.987 ], 00:18:55.987 "driver_specific": {} 00:18:55.987 } 00:18:55.987 ] 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.987 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:56.247 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.247 "name": "Existed_Raid", 00:18:56.247 "uuid": "867db84e-a007-408b-bf2e-3929151b5973", 00:18:56.247 "strip_size_kb": 64, 00:18:56.247 "state": "online", 00:18:56.247 "raid_level": "raid0", 00:18:56.247 "superblock": false, 00:18:56.247 "num_base_bdevs": 4, 00:18:56.247 "num_base_bdevs_discovered": 4, 00:18:56.247 "num_base_bdevs_operational": 4, 00:18:56.247 "base_bdevs_list": [ 00:18:56.247 { 00:18:56.247 "name": "BaseBdev1", 00:18:56.247 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:56.247 "is_configured": true, 00:18:56.247 "data_offset": 0, 00:18:56.247 "data_size": 65536 00:18:56.247 }, 00:18:56.247 { 00:18:56.247 "name": 
"BaseBdev2", 00:18:56.247 "uuid": "7dcd3540-665b-4dc4-8e33-d335c46f7b9b", 00:18:56.247 "is_configured": true, 00:18:56.247 "data_offset": 0, 00:18:56.247 "data_size": 65536 00:18:56.247 }, 00:18:56.247 { 00:18:56.247 "name": "BaseBdev3", 00:18:56.247 "uuid": "e4c6988e-897a-40aa-8e55-1371b51842ee", 00:18:56.247 "is_configured": true, 00:18:56.247 "data_offset": 0, 00:18:56.247 "data_size": 65536 00:18:56.247 }, 00:18:56.247 { 00:18:56.247 "name": "BaseBdev4", 00:18:56.247 "uuid": "073c49b0-ce8a-4449-b73f-dbc20ccb5b84", 00:18:56.247 "is_configured": true, 00:18:56.247 "data_offset": 0, 00:18:56.247 "data_size": 65536 00:18:56.247 } 00:18:56.247 ] 00:18:56.247 }' 00:18:56.247 07:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.247 07:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:56.813 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:57.073 [2024-07-25 07:24:29.406915] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:57.073 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:57.073 "name": "Existed_Raid", 00:18:57.073 "aliases": [ 00:18:57.073 "867db84e-a007-408b-bf2e-3929151b5973" 00:18:57.073 ], 00:18:57.073 "product_name": "Raid Volume", 00:18:57.073 "block_size": 512, 00:18:57.073 "num_blocks": 262144, 00:18:57.073 "uuid": "867db84e-a007-408b-bf2e-3929151b5973", 00:18:57.073 "assigned_rate_limits": { 00:18:57.073 "rw_ios_per_sec": 0, 00:18:57.073 "rw_mbytes_per_sec": 0, 00:18:57.073 "r_mbytes_per_sec": 0, 00:18:57.073 "w_mbytes_per_sec": 0 00:18:57.073 }, 00:18:57.073 "claimed": false, 00:18:57.073 "zoned": false, 00:18:57.073 "supported_io_types": { 00:18:57.073 "read": true, 00:18:57.073 "write": true, 00:18:57.073 "unmap": true, 00:18:57.073 "flush": true, 00:18:57.073 "reset": true, 00:18:57.073 "nvme_admin": false, 00:18:57.073 "nvme_io": false, 00:18:57.073 "nvme_io_md": false, 00:18:57.073 "write_zeroes": true, 00:18:57.073 "zcopy": false, 00:18:57.073 "get_zone_info": false, 00:18:57.073 "zone_management": false, 00:18:57.073 "zone_append": false, 00:18:57.073 "compare": false, 00:18:57.073 "compare_and_write": false, 00:18:57.073 "abort": false, 00:18:57.073 "seek_hole": false, 00:18:57.073 "seek_data": false, 00:18:57.073 "copy": false, 00:18:57.073 "nvme_iov_md": false 00:18:57.073 }, 00:18:57.073 "memory_domains": [ 00:18:57.073 { 00:18:57.073 "dma_device_id": "system", 00:18:57.073 "dma_device_type": 1 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.073 
"dma_device_type": 2 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "dma_device_id": "system", 00:18:57.073 "dma_device_type": 1 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.073 "dma_device_type": 2 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "dma_device_id": "system", 00:18:57.073 "dma_device_type": 1 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.073 "dma_device_type": 2 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "dma_device_id": "system", 00:18:57.073 "dma_device_type": 1 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.073 "dma_device_type": 2 00:18:57.073 } 00:18:57.073 ], 00:18:57.073 "driver_specific": { 00:18:57.073 "raid": { 00:18:57.073 "uuid": "867db84e-a007-408b-bf2e-3929151b5973", 00:18:57.073 "strip_size_kb": 64, 00:18:57.073 "state": "online", 00:18:57.073 "raid_level": "raid0", 00:18:57.073 "superblock": false, 00:18:57.073 "num_base_bdevs": 4, 00:18:57.073 "num_base_bdevs_discovered": 4, 00:18:57.073 "num_base_bdevs_operational": 4, 00:18:57.073 "base_bdevs_list": [ 00:18:57.073 { 00:18:57.073 "name": "BaseBdev1", 00:18:57.073 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:57.073 "is_configured": true, 00:18:57.073 "data_offset": 0, 00:18:57.073 "data_size": 65536 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "name": "BaseBdev2", 00:18:57.073 "uuid": "7dcd3540-665b-4dc4-8e33-d335c46f7b9b", 00:18:57.073 "is_configured": true, 00:18:57.073 "data_offset": 0, 00:18:57.073 "data_size": 65536 00:18:57.073 }, 00:18:57.073 { 00:18:57.073 "name": "BaseBdev3", 00:18:57.073 "uuid": "e4c6988e-897a-40aa-8e55-1371b51842ee", 00:18:57.073 "is_configured": true, 00:18:57.073 "data_offset": 0, 00:18:57.073 "data_size": 65536 00:18:57.074 }, 00:18:57.074 { 00:18:57.074 "name": "BaseBdev4", 00:18:57.074 "uuid": "073c49b0-ce8a-4449-b73f-dbc20ccb5b84", 00:18:57.074 "is_configured": true, 00:18:57.074 "data_offset": 0, 00:18:57.074 "data_size": 65536 00:18:57.074 } 00:18:57.074 ] 00:18:57.074 } 00:18:57.074 } 00:18:57.074 }' 00:18:57.074 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:57.074 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:57.074 BaseBdev2 00:18:57.074 BaseBdev3 00:18:57.074 BaseBdev4' 00:18:57.074 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:57.074 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:57.074 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:57.333 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:57.333 "name": "BaseBdev1", 00:18:57.333 "aliases": [ 00:18:57.333 "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf" 00:18:57.333 ], 00:18:57.333 "product_name": "Malloc disk", 00:18:57.333 "block_size": 512, 00:18:57.333 "num_blocks": 65536, 00:18:57.333 "uuid": "cb02458b-e252-41c1-88fe-5b9f2e2bbcdf", 00:18:57.333 "assigned_rate_limits": { 00:18:57.333 "rw_ios_per_sec": 0, 00:18:57.333 "rw_mbytes_per_sec": 0, 00:18:57.333 "r_mbytes_per_sec": 0, 00:18:57.333 "w_mbytes_per_sec": 0 00:18:57.333 }, 00:18:57.333 "claimed": true, 00:18:57.333 "claim_type": "exclusive_write", 
00:18:57.333 "zoned": false, 00:18:57.333 "supported_io_types": { 00:18:57.333 "read": true, 00:18:57.333 "write": true, 00:18:57.333 "unmap": true, 00:18:57.333 "flush": true, 00:18:57.333 "reset": true, 00:18:57.333 "nvme_admin": false, 00:18:57.333 "nvme_io": false, 00:18:57.333 "nvme_io_md": false, 00:18:57.333 "write_zeroes": true, 00:18:57.333 "zcopy": true, 00:18:57.333 "get_zone_info": false, 00:18:57.333 "zone_management": false, 00:18:57.333 "zone_append": false, 00:18:57.333 "compare": false, 00:18:57.333 "compare_and_write": false, 00:18:57.333 "abort": true, 00:18:57.333 "seek_hole": false, 00:18:57.333 "seek_data": false, 00:18:57.333 "copy": true, 00:18:57.333 "nvme_iov_md": false 00:18:57.333 }, 00:18:57.333 "memory_domains": [ 00:18:57.333 { 00:18:57.333 "dma_device_id": "system", 00:18:57.333 "dma_device_type": 1 00:18:57.333 }, 00:18:57.333 { 00:18:57.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.333 "dma_device_type": 2 00:18:57.333 } 00:18:57.333 ], 00:18:57.333 "driver_specific": {} 00:18:57.333 }' 00:18:57.333 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.333 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.333 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:57.333 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.333 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.592 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:57.592 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.592 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.593 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.593 07:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.593 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.593 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:57.593 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:57.593 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:57.593 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:57.852 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:57.852 "name": "BaseBdev2", 00:18:57.852 "aliases": [ 00:18:57.852 "7dcd3540-665b-4dc4-8e33-d335c46f7b9b" 00:18:57.852 ], 00:18:57.852 "product_name": "Malloc disk", 00:18:57.852 "block_size": 512, 00:18:57.852 "num_blocks": 65536, 00:18:57.852 "uuid": "7dcd3540-665b-4dc4-8e33-d335c46f7b9b", 00:18:57.852 "assigned_rate_limits": { 00:18:57.852 "rw_ios_per_sec": 0, 00:18:57.852 "rw_mbytes_per_sec": 0, 00:18:57.852 "r_mbytes_per_sec": 0, 00:18:57.852 "w_mbytes_per_sec": 0 00:18:57.852 }, 00:18:57.852 "claimed": true, 00:18:57.852 "claim_type": "exclusive_write", 00:18:57.852 "zoned": false, 00:18:57.852 "supported_io_types": { 00:18:57.852 "read": true, 00:18:57.852 "write": true, 00:18:57.852 "unmap": true, 00:18:57.852 "flush": true, 
00:18:57.852 "reset": true, 00:18:57.852 "nvme_admin": false, 00:18:57.852 "nvme_io": false, 00:18:57.852 "nvme_io_md": false, 00:18:57.852 "write_zeroes": true, 00:18:57.852 "zcopy": true, 00:18:57.852 "get_zone_info": false, 00:18:57.852 "zone_management": false, 00:18:57.852 "zone_append": false, 00:18:57.852 "compare": false, 00:18:57.852 "compare_and_write": false, 00:18:57.852 "abort": true, 00:18:57.852 "seek_hole": false, 00:18:57.852 "seek_data": false, 00:18:57.852 "copy": true, 00:18:57.852 "nvme_iov_md": false 00:18:57.852 }, 00:18:57.852 "memory_domains": [ 00:18:57.852 { 00:18:57.852 "dma_device_id": "system", 00:18:57.852 "dma_device_type": 1 00:18:57.852 }, 00:18:57.852 { 00:18:57.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.852 "dma_device_type": 2 00:18:57.852 } 00:18:57.852 ], 00:18:57.852 "driver_specific": {} 00:18:57.852 }' 00:18:57.852 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.852 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.852 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:57.852 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:58.111 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:58.370 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:58.370 "name": "BaseBdev3", 00:18:58.370 "aliases": [ 00:18:58.370 "e4c6988e-897a-40aa-8e55-1371b51842ee" 00:18:58.370 ], 00:18:58.370 "product_name": "Malloc disk", 00:18:58.370 "block_size": 512, 00:18:58.370 "num_blocks": 65536, 00:18:58.370 "uuid": "e4c6988e-897a-40aa-8e55-1371b51842ee", 00:18:58.370 "assigned_rate_limits": { 00:18:58.370 "rw_ios_per_sec": 0, 00:18:58.370 "rw_mbytes_per_sec": 0, 00:18:58.370 "r_mbytes_per_sec": 0, 00:18:58.370 "w_mbytes_per_sec": 0 00:18:58.370 }, 00:18:58.370 "claimed": true, 00:18:58.370 "claim_type": "exclusive_write", 00:18:58.370 "zoned": false, 00:18:58.370 "supported_io_types": { 00:18:58.370 "read": true, 00:18:58.370 "write": true, 00:18:58.370 "unmap": true, 00:18:58.370 "flush": true, 00:18:58.370 "reset": true, 00:18:58.370 "nvme_admin": false, 00:18:58.370 "nvme_io": false, 00:18:58.370 "nvme_io_md": false, 00:18:58.370 "write_zeroes": true, 00:18:58.370 
"zcopy": true, 00:18:58.370 "get_zone_info": false, 00:18:58.370 "zone_management": false, 00:18:58.370 "zone_append": false, 00:18:58.370 "compare": false, 00:18:58.370 "compare_and_write": false, 00:18:58.370 "abort": true, 00:18:58.370 "seek_hole": false, 00:18:58.370 "seek_data": false, 00:18:58.370 "copy": true, 00:18:58.370 "nvme_iov_md": false 00:18:58.370 }, 00:18:58.370 "memory_domains": [ 00:18:58.370 { 00:18:58.370 "dma_device_id": "system", 00:18:58.370 "dma_device_type": 1 00:18:58.370 }, 00:18:58.370 { 00:18:58.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.370 "dma_device_type": 2 00:18:58.370 } 00:18:58.370 ], 00:18:58.370 "driver_specific": {} 00:18:58.370 }' 00:18:58.370 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.629 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.629 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:58.629 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.629 07:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.629 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:58.629 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.629 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.629 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.629 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.888 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.888 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.888 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.888 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:58.888 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:59.456 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:59.456 "name": "BaseBdev4", 00:18:59.456 "aliases": [ 00:18:59.456 "073c49b0-ce8a-4449-b73f-dbc20ccb5b84" 00:18:59.456 ], 00:18:59.456 "product_name": "Malloc disk", 00:18:59.456 "block_size": 512, 00:18:59.456 "num_blocks": 65536, 00:18:59.456 "uuid": "073c49b0-ce8a-4449-b73f-dbc20ccb5b84", 00:18:59.456 "assigned_rate_limits": { 00:18:59.456 "rw_ios_per_sec": 0, 00:18:59.456 "rw_mbytes_per_sec": 0, 00:18:59.456 "r_mbytes_per_sec": 0, 00:18:59.456 "w_mbytes_per_sec": 0 00:18:59.456 }, 00:18:59.456 "claimed": true, 00:18:59.456 "claim_type": "exclusive_write", 00:18:59.456 "zoned": false, 00:18:59.456 "supported_io_types": { 00:18:59.456 "read": true, 00:18:59.456 "write": true, 00:18:59.456 "unmap": true, 00:18:59.456 "flush": true, 00:18:59.456 "reset": true, 00:18:59.456 "nvme_admin": false, 00:18:59.457 "nvme_io": false, 00:18:59.457 "nvme_io_md": false, 00:18:59.457 "write_zeroes": true, 00:18:59.457 "zcopy": true, 00:18:59.457 "get_zone_info": false, 00:18:59.457 "zone_management": false, 00:18:59.457 "zone_append": false, 00:18:59.457 "compare": false, 00:18:59.457 
"compare_and_write": false, 00:18:59.457 "abort": true, 00:18:59.457 "seek_hole": false, 00:18:59.457 "seek_data": false, 00:18:59.457 "copy": true, 00:18:59.457 "nvme_iov_md": false 00:18:59.457 }, 00:18:59.457 "memory_domains": [ 00:18:59.457 { 00:18:59.457 "dma_device_id": "system", 00:18:59.457 "dma_device_type": 1 00:18:59.457 }, 00:18:59.457 { 00:18:59.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.457 "dma_device_type": 2 00:18:59.457 } 00:18:59.457 ], 00:18:59.457 "driver_specific": {} 00:18:59.457 }' 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:59.457 07:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.716 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.716 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:59.716 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:59.975 [2024-07-25 07:24:32.298270] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:59.975 [2024-07-25 07:24:32.298294] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:59.975 [2024-07-25 07:24:32.298337] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.975 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:00.234 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:00.234 "name": "Existed_Raid", 00:19:00.234 "uuid": "867db84e-a007-408b-bf2e-3929151b5973", 00:19:00.234 "strip_size_kb": 64, 00:19:00.234 "state": "offline", 00:19:00.234 "raid_level": "raid0", 00:19:00.234 "superblock": false, 00:19:00.234 "num_base_bdevs": 4, 00:19:00.234 "num_base_bdevs_discovered": 3, 00:19:00.234 "num_base_bdevs_operational": 3, 00:19:00.234 "base_bdevs_list": [ 00:19:00.234 { 00:19:00.234 "name": null, 00:19:00.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.234 "is_configured": false, 00:19:00.234 "data_offset": 0, 00:19:00.234 "data_size": 65536 00:19:00.234 }, 00:19:00.234 { 00:19:00.234 "name": "BaseBdev2", 00:19:00.234 "uuid": "7dcd3540-665b-4dc4-8e33-d335c46f7b9b", 00:19:00.234 "is_configured": true, 00:19:00.234 "data_offset": 0, 00:19:00.234 "data_size": 65536 00:19:00.234 }, 00:19:00.234 { 00:19:00.234 "name": "BaseBdev3", 00:19:00.234 "uuid": "e4c6988e-897a-40aa-8e55-1371b51842ee", 00:19:00.234 "is_configured": true, 00:19:00.234 "data_offset": 0, 00:19:00.234 "data_size": 65536 00:19:00.234 }, 00:19:00.234 { 00:19:00.234 "name": "BaseBdev4", 00:19:00.234 "uuid": "073c49b0-ce8a-4449-b73f-dbc20ccb5b84", 00:19:00.234 "is_configured": true, 00:19:00.234 "data_offset": 0, 00:19:00.234 "data_size": 65536 00:19:00.234 } 00:19:00.234 ] 00:19:00.234 }' 00:19:00.234 07:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:00.234 07:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:00.802 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:00.802 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:00.802 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:00.802 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.061 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:01.061 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:01.061 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:01.061 [2024-07-25 07:24:33.538510] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:01.061 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ 
)) 00:19:01.061 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:01.061 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.061 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:01.320 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:01.320 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:01.320 07:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:01.886 [2024-07-25 07:24:34.270532] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:01.886 07:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:01.886 07:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:01.886 07:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.886 07:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:02.145 07:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:02.145 07:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:02.145 07:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:02.753 [2024-07-25 07:24:35.018457] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:02.753 [2024-07-25 07:24:35.018498] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8dd7b0 name Existed_Raid, state offline 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:02.753 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:03.012 BaseBdev2 00:19:03.012 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- 
# waitforbdev BaseBdev2 00:19:03.012 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:03.012 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:03.012 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:03.012 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:03.012 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:03.012 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:03.271 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:03.530 [ 00:19:03.530 { 00:19:03.530 "name": "BaseBdev2", 00:19:03.530 "aliases": [ 00:19:03.530 "b95ec86a-976f-40ea-8385-6eb0e08072bb" 00:19:03.530 ], 00:19:03.530 "product_name": "Malloc disk", 00:19:03.530 "block_size": 512, 00:19:03.530 "num_blocks": 65536, 00:19:03.530 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:03.530 "assigned_rate_limits": { 00:19:03.530 "rw_ios_per_sec": 0, 00:19:03.530 "rw_mbytes_per_sec": 0, 00:19:03.530 "r_mbytes_per_sec": 0, 00:19:03.530 "w_mbytes_per_sec": 0 00:19:03.530 }, 00:19:03.530 "claimed": false, 00:19:03.530 "zoned": false, 00:19:03.530 "supported_io_types": { 00:19:03.530 "read": true, 00:19:03.530 "write": true, 00:19:03.530 "unmap": true, 00:19:03.530 "flush": true, 00:19:03.530 "reset": true, 00:19:03.530 "nvme_admin": false, 00:19:03.530 "nvme_io": false, 00:19:03.530 "nvme_io_md": false, 00:19:03.530 "write_zeroes": true, 00:19:03.530 "zcopy": true, 00:19:03.530 "get_zone_info": false, 00:19:03.530 "zone_management": false, 00:19:03.530 "zone_append": false, 00:19:03.530 "compare": false, 00:19:03.530 "compare_and_write": false, 00:19:03.530 "abort": true, 00:19:03.530 "seek_hole": false, 00:19:03.530 "seek_data": false, 00:19:03.530 "copy": true, 00:19:03.530 "nvme_iov_md": false 00:19:03.530 }, 00:19:03.530 "memory_domains": [ 00:19:03.530 { 00:19:03.530 "dma_device_id": "system", 00:19:03.530 "dma_device_type": 1 00:19:03.530 }, 00:19:03.530 { 00:19:03.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.530 "dma_device_type": 2 00:19:03.530 } 00:19:03.530 ], 00:19:03.530 "driver_specific": {} 00:19:03.530 } 00:19:03.530 ] 00:19:03.530 07:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:03.530 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:03.530 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:03.530 07:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:03.789 BaseBdev3 00:19:03.789 07:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:03.789 07:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:03.789 07:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:03.789 07:24:36 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:03.789 07:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:03.789 07:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:03.789 07:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:04.048 07:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:04.616 [ 00:19:04.616 { 00:19:04.616 "name": "BaseBdev3", 00:19:04.616 "aliases": [ 00:19:04.616 "281a6409-b60b-4049-9c0e-537a401cd138" 00:19:04.616 ], 00:19:04.616 "product_name": "Malloc disk", 00:19:04.616 "block_size": 512, 00:19:04.616 "num_blocks": 65536, 00:19:04.616 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:04.616 "assigned_rate_limits": { 00:19:04.616 "rw_ios_per_sec": 0, 00:19:04.616 "rw_mbytes_per_sec": 0, 00:19:04.616 "r_mbytes_per_sec": 0, 00:19:04.616 "w_mbytes_per_sec": 0 00:19:04.616 }, 00:19:04.616 "claimed": false, 00:19:04.616 "zoned": false, 00:19:04.616 "supported_io_types": { 00:19:04.616 "read": true, 00:19:04.616 "write": true, 00:19:04.616 "unmap": true, 00:19:04.616 "flush": true, 00:19:04.616 "reset": true, 00:19:04.616 "nvme_admin": false, 00:19:04.616 "nvme_io": false, 00:19:04.616 "nvme_io_md": false, 00:19:04.616 "write_zeroes": true, 00:19:04.616 "zcopy": true, 00:19:04.616 "get_zone_info": false, 00:19:04.616 "zone_management": false, 00:19:04.616 "zone_append": false, 00:19:04.616 "compare": false, 00:19:04.616 "compare_and_write": false, 00:19:04.616 "abort": true, 00:19:04.616 "seek_hole": false, 00:19:04.616 "seek_data": false, 00:19:04.616 "copy": true, 00:19:04.616 "nvme_iov_md": false 00:19:04.616 }, 00:19:04.616 "memory_domains": [ 00:19:04.616 { 00:19:04.616 "dma_device_id": "system", 00:19:04.616 "dma_device_type": 1 00:19:04.616 }, 00:19:04.616 { 00:19:04.616 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.616 "dma_device_type": 2 00:19:04.616 } 00:19:04.616 ], 00:19:04.616 "driver_specific": {} 00:19:04.616 } 00:19:04.616 ] 00:19:04.616 07:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:04.616 07:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:04.616 07:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:04.616 07:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:04.875 BaseBdev4 00:19:04.875 07:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:04.875 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:04.875 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:04.875 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:04.875 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:04.875 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:19:04.875 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:05.442 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:05.442 [ 00:19:05.442 { 00:19:05.442 "name": "BaseBdev4", 00:19:05.442 "aliases": [ 00:19:05.442 "8ec33c5f-bf4c-495d-9095-f393a9bb64c8" 00:19:05.442 ], 00:19:05.442 "product_name": "Malloc disk", 00:19:05.442 "block_size": 512, 00:19:05.442 "num_blocks": 65536, 00:19:05.442 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:05.442 "assigned_rate_limits": { 00:19:05.442 "rw_ios_per_sec": 0, 00:19:05.442 "rw_mbytes_per_sec": 0, 00:19:05.442 "r_mbytes_per_sec": 0, 00:19:05.442 "w_mbytes_per_sec": 0 00:19:05.442 }, 00:19:05.442 "claimed": false, 00:19:05.442 "zoned": false, 00:19:05.442 "supported_io_types": { 00:19:05.442 "read": true, 00:19:05.442 "write": true, 00:19:05.442 "unmap": true, 00:19:05.442 "flush": true, 00:19:05.442 "reset": true, 00:19:05.442 "nvme_admin": false, 00:19:05.442 "nvme_io": false, 00:19:05.442 "nvme_io_md": false, 00:19:05.442 "write_zeroes": true, 00:19:05.442 "zcopy": true, 00:19:05.442 "get_zone_info": false, 00:19:05.442 "zone_management": false, 00:19:05.442 "zone_append": false, 00:19:05.442 "compare": false, 00:19:05.442 "compare_and_write": false, 00:19:05.442 "abort": true, 00:19:05.442 "seek_hole": false, 00:19:05.442 "seek_data": false, 00:19:05.442 "copy": true, 00:19:05.442 "nvme_iov_md": false 00:19:05.442 }, 00:19:05.442 "memory_domains": [ 00:19:05.442 { 00:19:05.442 "dma_device_id": "system", 00:19:05.442 "dma_device_type": 1 00:19:05.442 }, 00:19:05.442 { 00:19:05.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.442 "dma_device_type": 2 00:19:05.442 } 00:19:05.442 ], 00:19:05.442 "driver_specific": {} 00:19:05.442 } 00:19:05.442 ] 00:19:05.442 07:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:05.442 07:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:05.442 07:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:05.443 07:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:06.010 [2024-07-25 07:24:38.390153] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:06.010 [2024-07-25 07:24:38.390192] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:06.010 [2024-07-25 07:24:38.390209] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:06.010 [2024-07-25 07:24:38.391425] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:06.010 [2024-07-25 07:24:38.391462] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
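The bdev_raid_create call traced just above names all four members even though BaseBdev1 does not exist yet; the RPC still succeeds, the three existing malloc bdevs are claimed, and the array is left in the configuring state until the missing member shows up. The call, copied from the trace:

    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    # -z 64: strip size in KiB (matches strip_size_kb above), -r: RAID level,
    # -b: space-separated base bdev names, -n: raid bdev name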
00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.010 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.270 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.270 "name": "Existed_Raid", 00:19:06.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.270 "strip_size_kb": 64, 00:19:06.270 "state": "configuring", 00:19:06.270 "raid_level": "raid0", 00:19:06.270 "superblock": false, 00:19:06.270 "num_base_bdevs": 4, 00:19:06.270 "num_base_bdevs_discovered": 3, 00:19:06.270 "num_base_bdevs_operational": 4, 00:19:06.270 "base_bdevs_list": [ 00:19:06.270 { 00:19:06.270 "name": "BaseBdev1", 00:19:06.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.270 "is_configured": false, 00:19:06.270 "data_offset": 0, 00:19:06.270 "data_size": 0 00:19:06.270 }, 00:19:06.270 { 00:19:06.270 "name": "BaseBdev2", 00:19:06.270 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:06.270 "is_configured": true, 00:19:06.270 "data_offset": 0, 00:19:06.270 "data_size": 65536 00:19:06.270 }, 00:19:06.270 { 00:19:06.270 "name": "BaseBdev3", 00:19:06.270 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:06.270 "is_configured": true, 00:19:06.270 "data_offset": 0, 00:19:06.270 "data_size": 65536 00:19:06.270 }, 00:19:06.270 { 00:19:06.270 "name": "BaseBdev4", 00:19:06.270 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:06.270 "is_configured": true, 00:19:06.270 "data_offset": 0, 00:19:06.270 "data_size": 65536 00:19:06.270 } 00:19:06.270 ] 00:19:06.270 }' 00:19:06.270 07:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.270 07:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.838 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:07.097 [2024-07-25 07:24:39.400776] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.097 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.357 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.357 "name": "Existed_Raid", 00:19:07.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.357 "strip_size_kb": 64, 00:19:07.357 "state": "configuring", 00:19:07.357 "raid_level": "raid0", 00:19:07.357 "superblock": false, 00:19:07.357 "num_base_bdevs": 4, 00:19:07.357 "num_base_bdevs_discovered": 2, 00:19:07.357 "num_base_bdevs_operational": 4, 00:19:07.357 "base_bdevs_list": [ 00:19:07.357 { 00:19:07.357 "name": "BaseBdev1", 00:19:07.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.357 "is_configured": false, 00:19:07.357 "data_offset": 0, 00:19:07.357 "data_size": 0 00:19:07.357 }, 00:19:07.357 { 00:19:07.357 "name": null, 00:19:07.357 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:07.357 "is_configured": false, 00:19:07.357 "data_offset": 0, 00:19:07.357 "data_size": 65536 00:19:07.357 }, 00:19:07.357 { 00:19:07.357 "name": "BaseBdev3", 00:19:07.357 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:07.357 "is_configured": true, 00:19:07.357 "data_offset": 0, 00:19:07.357 "data_size": 65536 00:19:07.357 }, 00:19:07.357 { 00:19:07.357 "name": "BaseBdev4", 00:19:07.357 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:07.357 "is_configured": true, 00:19:07.357 "data_offset": 0, 00:19:07.357 "data_size": 65536 00:19:07.357 } 00:19:07.357 ] 00:19:07.357 }' 00:19:07.357 07:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.357 07:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.924 07:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.925 07:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:07.925 07:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:07.925 07:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:08.183 [2024-07-25 07:24:40.631148] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 
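The malloc create just issued, together with the waitforbdev step that follows, boils down to three RPCs against the same socket: create the disk, flush outstanding examine callbacks, then poll for the new bdev with the 2000 ms timeout used throughout this run (a minimal sketch of the helper sequence, not its full retry logic):

# recreate BaseBdev1 as a 32 MiB malloc disk with 512-byte blocks
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_malloc_create 32 512 -b BaseBdev1
# wait for examine callbacks to finish, then block up to 2000 ms for the bdev to appear
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_get_bdevs -b BaseBdev1 -t 2000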
00:19:08.183 BaseBdev1 00:19:08.183 07:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:08.183 07:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:08.184 07:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:08.184 07:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:08.184 07:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:08.184 07:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:08.184 07:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:08.442 07:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:08.700 [ 00:19:08.700 { 00:19:08.700 "name": "BaseBdev1", 00:19:08.700 "aliases": [ 00:19:08.700 "6a5e5291-25c2-4e5f-852e-0a0f24b1a268" 00:19:08.700 ], 00:19:08.700 "product_name": "Malloc disk", 00:19:08.700 "block_size": 512, 00:19:08.700 "num_blocks": 65536, 00:19:08.700 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:08.700 "assigned_rate_limits": { 00:19:08.700 "rw_ios_per_sec": 0, 00:19:08.700 "rw_mbytes_per_sec": 0, 00:19:08.700 "r_mbytes_per_sec": 0, 00:19:08.700 "w_mbytes_per_sec": 0 00:19:08.700 }, 00:19:08.700 "claimed": true, 00:19:08.700 "claim_type": "exclusive_write", 00:19:08.700 "zoned": false, 00:19:08.700 "supported_io_types": { 00:19:08.700 "read": true, 00:19:08.700 "write": true, 00:19:08.700 "unmap": true, 00:19:08.700 "flush": true, 00:19:08.700 "reset": true, 00:19:08.700 "nvme_admin": false, 00:19:08.700 "nvme_io": false, 00:19:08.700 "nvme_io_md": false, 00:19:08.700 "write_zeroes": true, 00:19:08.700 "zcopy": true, 00:19:08.700 "get_zone_info": false, 00:19:08.700 "zone_management": false, 00:19:08.700 "zone_append": false, 00:19:08.700 "compare": false, 00:19:08.700 "compare_and_write": false, 00:19:08.700 "abort": true, 00:19:08.700 "seek_hole": false, 00:19:08.700 "seek_data": false, 00:19:08.700 "copy": true, 00:19:08.700 "nvme_iov_md": false 00:19:08.700 }, 00:19:08.700 "memory_domains": [ 00:19:08.700 { 00:19:08.700 "dma_device_id": "system", 00:19:08.701 "dma_device_type": 1 00:19:08.701 }, 00:19:08.701 { 00:19:08.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.701 "dma_device_type": 2 00:19:08.701 } 00:19:08.701 ], 00:19:08.701 "driver_specific": {} 00:19:08.701 } 00:19:08.701 ] 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.701 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:08.959 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.959 "name": "Existed_Raid", 00:19:08.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.959 "strip_size_kb": 64, 00:19:08.959 "state": "configuring", 00:19:08.959 "raid_level": "raid0", 00:19:08.959 "superblock": false, 00:19:08.959 "num_base_bdevs": 4, 00:19:08.959 "num_base_bdevs_discovered": 3, 00:19:08.959 "num_base_bdevs_operational": 4, 00:19:08.959 "base_bdevs_list": [ 00:19:08.959 { 00:19:08.959 "name": "BaseBdev1", 00:19:08.959 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:08.959 "is_configured": true, 00:19:08.959 "data_offset": 0, 00:19:08.959 "data_size": 65536 00:19:08.959 }, 00:19:08.959 { 00:19:08.959 "name": null, 00:19:08.959 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:08.959 "is_configured": false, 00:19:08.959 "data_offset": 0, 00:19:08.959 "data_size": 65536 00:19:08.959 }, 00:19:08.959 { 00:19:08.959 "name": "BaseBdev3", 00:19:08.959 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:08.959 "is_configured": true, 00:19:08.959 "data_offset": 0, 00:19:08.959 "data_size": 65536 00:19:08.959 }, 00:19:08.959 { 00:19:08.959 "name": "BaseBdev4", 00:19:08.959 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:08.959 "is_configured": true, 00:19:08.959 "data_offset": 0, 00:19:08.959 "data_size": 65536 00:19:08.959 } 00:19:08.959 ] 00:19:08.959 }' 00:19:08.959 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.959 07:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.526 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.526 07:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:09.785 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:09.785 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:10.043 [2024-07-25 07:24:42.355710] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.043 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:10.303 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.303 "name": "Existed_Raid", 00:19:10.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.303 "strip_size_kb": 64, 00:19:10.303 "state": "configuring", 00:19:10.303 "raid_level": "raid0", 00:19:10.303 "superblock": false, 00:19:10.303 "num_base_bdevs": 4, 00:19:10.303 "num_base_bdevs_discovered": 2, 00:19:10.303 "num_base_bdevs_operational": 4, 00:19:10.303 "base_bdevs_list": [ 00:19:10.303 { 00:19:10.303 "name": "BaseBdev1", 00:19:10.303 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:10.303 "is_configured": true, 00:19:10.303 "data_offset": 0, 00:19:10.303 "data_size": 65536 00:19:10.303 }, 00:19:10.303 { 00:19:10.303 "name": null, 00:19:10.303 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:10.303 "is_configured": false, 00:19:10.303 "data_offset": 0, 00:19:10.303 "data_size": 65536 00:19:10.303 }, 00:19:10.303 { 00:19:10.303 "name": null, 00:19:10.303 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:10.303 "is_configured": false, 00:19:10.303 "data_offset": 0, 00:19:10.303 "data_size": 65536 00:19:10.303 }, 00:19:10.303 { 00:19:10.303 "name": "BaseBdev4", 00:19:10.303 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:10.303 "is_configured": true, 00:19:10.303 "data_offset": 0, 00:19:10.303 "data_size": 65536 00:19:10.303 } 00:19:10.303 ] 00:19:10.303 }' 00:19:10.303 07:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.303 07:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.870 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.870 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:11.128 [2024-07-25 07:24:43.623070] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev3 is claimed 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.128 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:11.386 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.386 "name": "Existed_Raid", 00:19:11.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.386 "strip_size_kb": 64, 00:19:11.386 "state": "configuring", 00:19:11.386 "raid_level": "raid0", 00:19:11.386 "superblock": false, 00:19:11.386 "num_base_bdevs": 4, 00:19:11.386 "num_base_bdevs_discovered": 3, 00:19:11.386 "num_base_bdevs_operational": 4, 00:19:11.386 "base_bdevs_list": [ 00:19:11.386 { 00:19:11.386 "name": "BaseBdev1", 00:19:11.386 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:11.386 "is_configured": true, 00:19:11.386 "data_offset": 0, 00:19:11.386 "data_size": 65536 00:19:11.386 }, 00:19:11.386 { 00:19:11.386 "name": null, 00:19:11.386 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:11.386 "is_configured": false, 00:19:11.386 "data_offset": 0, 00:19:11.386 "data_size": 65536 00:19:11.386 }, 00:19:11.386 { 00:19:11.386 "name": "BaseBdev3", 00:19:11.386 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:11.386 "is_configured": true, 00:19:11.386 "data_offset": 0, 00:19:11.386 "data_size": 65536 00:19:11.386 }, 00:19:11.386 { 00:19:11.386 "name": "BaseBdev4", 00:19:11.386 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:11.386 "is_configured": true, 00:19:11.386 "data_offset": 0, 00:19:11.386 "data_size": 65536 00:19:11.386 } 00:19:11.386 ] 00:19:11.386 }' 00:19:11.386 07:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.386 07:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.952 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.952 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:12.210 07:24:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:12.210 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:12.469 [2024-07-25 07:24:44.842286] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:12.469 07:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.727 07:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:12.727 "name": "Existed_Raid", 00:19:12.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.727 "strip_size_kb": 64, 00:19:12.727 "state": "configuring", 00:19:12.727 "raid_level": "raid0", 00:19:12.727 "superblock": false, 00:19:12.727 "num_base_bdevs": 4, 00:19:12.727 "num_base_bdevs_discovered": 2, 00:19:12.727 "num_base_bdevs_operational": 4, 00:19:12.727 "base_bdevs_list": [ 00:19:12.727 { 00:19:12.727 "name": null, 00:19:12.727 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:12.727 "is_configured": false, 00:19:12.727 "data_offset": 0, 00:19:12.727 "data_size": 65536 00:19:12.727 }, 00:19:12.727 { 00:19:12.727 "name": null, 00:19:12.727 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:12.727 "is_configured": false, 00:19:12.727 "data_offset": 0, 00:19:12.727 "data_size": 65536 00:19:12.727 }, 00:19:12.727 { 00:19:12.727 "name": "BaseBdev3", 00:19:12.727 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:12.727 "is_configured": true, 00:19:12.727 "data_offset": 0, 00:19:12.727 "data_size": 65536 00:19:12.727 }, 00:19:12.727 { 00:19:12.727 "name": "BaseBdev4", 00:19:12.727 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:12.727 "is_configured": true, 00:19:12.727 "data_offset": 0, 00:19:12.727 "data_size": 65536 00:19:12.727 } 00:19:12.727 ] 00:19:12.727 }' 00:19:12.727 07:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:12.727 07:24:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:13.293 07:24:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.293 07:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:13.552 07:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:13.552 07:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:13.810 [2024-07-25 07:24:46.111483] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.810 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:14.068 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.068 "name": "Existed_Raid", 00:19:14.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.068 "strip_size_kb": 64, 00:19:14.068 "state": "configuring", 00:19:14.068 "raid_level": "raid0", 00:19:14.068 "superblock": false, 00:19:14.068 "num_base_bdevs": 4, 00:19:14.068 "num_base_bdevs_discovered": 3, 00:19:14.068 "num_base_bdevs_operational": 4, 00:19:14.068 "base_bdevs_list": [ 00:19:14.068 { 00:19:14.068 "name": null, 00:19:14.068 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:14.068 "is_configured": false, 00:19:14.068 "data_offset": 0, 00:19:14.068 "data_size": 65536 00:19:14.068 }, 00:19:14.068 { 00:19:14.068 "name": "BaseBdev2", 00:19:14.069 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:14.069 "is_configured": true, 00:19:14.069 "data_offset": 0, 00:19:14.069 "data_size": 65536 00:19:14.069 }, 00:19:14.069 { 00:19:14.069 "name": "BaseBdev3", 00:19:14.069 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:14.069 "is_configured": true, 00:19:14.069 "data_offset": 0, 00:19:14.069 "data_size": 65536 00:19:14.069 }, 00:19:14.069 { 00:19:14.069 "name": "BaseBdev4", 00:19:14.069 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:14.069 "is_configured": true, 
00:19:14.069 "data_offset": 0, 00:19:14.069 "data_size": 65536 00:19:14.069 } 00:19:14.069 ] 00:19:14.069 }' 00:19:14.069 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.069 07:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.635 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.635 07:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:14.635 07:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:14.635 07:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.635 07:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:14.894 07:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6a5e5291-25c2-4e5f-852e-0a0f24b1a268 00:19:15.153 [2024-07-25 07:24:47.606665] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:15.153 [2024-07-25 07:24:47.606698] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x8d4a30 00:19:15.153 [2024-07-25 07:24:47.606706] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:15.153 [2024-07-25 07:24:47.606879] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8d4ef0 00:19:15.153 [2024-07-25 07:24:47.606986] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8d4a30 00:19:15.153 [2024-07-25 07:24:47.606994] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8d4a30 00:19:15.153 [2024-07-25 07:24:47.607151] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:15.153 NewBaseBdev 00:19:15.153 07:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:15.153 07:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:19:15.153 07:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:15.153 07:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:15.153 07:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:15.153 07:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:15.153 07:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.412 07:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:15.706 [ 00:19:15.706 { 00:19:15.706 "name": "NewBaseBdev", 00:19:15.706 "aliases": [ 00:19:15.706 "6a5e5291-25c2-4e5f-852e-0a0f24b1a268" 00:19:15.706 ], 00:19:15.706 "product_name": "Malloc disk", 
00:19:15.706 "block_size": 512, 00:19:15.706 "num_blocks": 65536, 00:19:15.706 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:15.706 "assigned_rate_limits": { 00:19:15.706 "rw_ios_per_sec": 0, 00:19:15.706 "rw_mbytes_per_sec": 0, 00:19:15.706 "r_mbytes_per_sec": 0, 00:19:15.706 "w_mbytes_per_sec": 0 00:19:15.706 }, 00:19:15.706 "claimed": true, 00:19:15.706 "claim_type": "exclusive_write", 00:19:15.706 "zoned": false, 00:19:15.706 "supported_io_types": { 00:19:15.706 "read": true, 00:19:15.706 "write": true, 00:19:15.706 "unmap": true, 00:19:15.706 "flush": true, 00:19:15.706 "reset": true, 00:19:15.706 "nvme_admin": false, 00:19:15.706 "nvme_io": false, 00:19:15.706 "nvme_io_md": false, 00:19:15.706 "write_zeroes": true, 00:19:15.706 "zcopy": true, 00:19:15.706 "get_zone_info": false, 00:19:15.706 "zone_management": false, 00:19:15.706 "zone_append": false, 00:19:15.706 "compare": false, 00:19:15.706 "compare_and_write": false, 00:19:15.706 "abort": true, 00:19:15.706 "seek_hole": false, 00:19:15.706 "seek_data": false, 00:19:15.706 "copy": true, 00:19:15.706 "nvme_iov_md": false 00:19:15.706 }, 00:19:15.706 "memory_domains": [ 00:19:15.706 { 00:19:15.706 "dma_device_id": "system", 00:19:15.706 "dma_device_type": 1 00:19:15.706 }, 00:19:15.706 { 00:19:15.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.706 "dma_device_type": 2 00:19:15.706 } 00:19:15.706 ], 00:19:15.706 "driver_specific": {} 00:19:15.706 } 00:19:15.706 ] 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.706 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.983 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.983 "name": "Existed_Raid", 00:19:15.983 "uuid": "5355aad0-9af4-4c65-ab14-5f25b25d248f", 00:19:15.983 "strip_size_kb": 64, 00:19:15.983 "state": "online", 00:19:15.983 "raid_level": "raid0", 00:19:15.983 "superblock": false, 00:19:15.983 "num_base_bdevs": 4, 00:19:15.983 "num_base_bdevs_discovered": 4, 00:19:15.983 "num_base_bdevs_operational": 4, 00:19:15.983 "base_bdevs_list": [ 
00:19:15.983 { 00:19:15.983 "name": "NewBaseBdev", 00:19:15.983 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:15.983 "is_configured": true, 00:19:15.983 "data_offset": 0, 00:19:15.983 "data_size": 65536 00:19:15.983 }, 00:19:15.983 { 00:19:15.983 "name": "BaseBdev2", 00:19:15.983 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:15.983 "is_configured": true, 00:19:15.983 "data_offset": 0, 00:19:15.983 "data_size": 65536 00:19:15.983 }, 00:19:15.983 { 00:19:15.983 "name": "BaseBdev3", 00:19:15.983 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:15.983 "is_configured": true, 00:19:15.983 "data_offset": 0, 00:19:15.983 "data_size": 65536 00:19:15.983 }, 00:19:15.983 { 00:19:15.983 "name": "BaseBdev4", 00:19:15.983 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:15.983 "is_configured": true, 00:19:15.983 "data_offset": 0, 00:19:15.983 "data_size": 65536 00:19:15.983 } 00:19:15.983 ] 00:19:15.983 }' 00:19:15.983 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.983 07:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:16.551 07:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:16.551 [2024-07-25 07:24:49.066937] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:16.811 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:16.811 "name": "Existed_Raid", 00:19:16.811 "aliases": [ 00:19:16.811 "5355aad0-9af4-4c65-ab14-5f25b25d248f" 00:19:16.811 ], 00:19:16.811 "product_name": "Raid Volume", 00:19:16.811 "block_size": 512, 00:19:16.811 "num_blocks": 262144, 00:19:16.811 "uuid": "5355aad0-9af4-4c65-ab14-5f25b25d248f", 00:19:16.811 "assigned_rate_limits": { 00:19:16.811 "rw_ios_per_sec": 0, 00:19:16.811 "rw_mbytes_per_sec": 0, 00:19:16.811 "r_mbytes_per_sec": 0, 00:19:16.811 "w_mbytes_per_sec": 0 00:19:16.811 }, 00:19:16.811 "claimed": false, 00:19:16.811 "zoned": false, 00:19:16.811 "supported_io_types": { 00:19:16.811 "read": true, 00:19:16.811 "write": true, 00:19:16.811 "unmap": true, 00:19:16.811 "flush": true, 00:19:16.811 "reset": true, 00:19:16.811 "nvme_admin": false, 00:19:16.811 "nvme_io": false, 00:19:16.811 "nvme_io_md": false, 00:19:16.811 "write_zeroes": true, 00:19:16.811 "zcopy": false, 00:19:16.811 "get_zone_info": false, 00:19:16.811 "zone_management": false, 00:19:16.811 "zone_append": false, 00:19:16.811 "compare": false, 00:19:16.811 "compare_and_write": false, 00:19:16.811 "abort": false, 00:19:16.811 "seek_hole": false, 00:19:16.811 "seek_data": false, 00:19:16.811 "copy": false, 00:19:16.811 
"nvme_iov_md": false 00:19:16.811 }, 00:19:16.811 "memory_domains": [ 00:19:16.811 { 00:19:16.811 "dma_device_id": "system", 00:19:16.811 "dma_device_type": 1 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.811 "dma_device_type": 2 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "dma_device_id": "system", 00:19:16.811 "dma_device_type": 1 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.811 "dma_device_type": 2 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "dma_device_id": "system", 00:19:16.811 "dma_device_type": 1 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.811 "dma_device_type": 2 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "dma_device_id": "system", 00:19:16.811 "dma_device_type": 1 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.811 "dma_device_type": 2 00:19:16.811 } 00:19:16.811 ], 00:19:16.811 "driver_specific": { 00:19:16.811 "raid": { 00:19:16.811 "uuid": "5355aad0-9af4-4c65-ab14-5f25b25d248f", 00:19:16.811 "strip_size_kb": 64, 00:19:16.811 "state": "online", 00:19:16.811 "raid_level": "raid0", 00:19:16.811 "superblock": false, 00:19:16.811 "num_base_bdevs": 4, 00:19:16.811 "num_base_bdevs_discovered": 4, 00:19:16.811 "num_base_bdevs_operational": 4, 00:19:16.811 "base_bdevs_list": [ 00:19:16.811 { 00:19:16.811 "name": "NewBaseBdev", 00:19:16.811 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 00:19:16.811 "is_configured": true, 00:19:16.811 "data_offset": 0, 00:19:16.811 "data_size": 65536 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "name": "BaseBdev2", 00:19:16.811 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:16.811 "is_configured": true, 00:19:16.811 "data_offset": 0, 00:19:16.811 "data_size": 65536 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "name": "BaseBdev3", 00:19:16.811 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:16.811 "is_configured": true, 00:19:16.811 "data_offset": 0, 00:19:16.811 "data_size": 65536 00:19:16.811 }, 00:19:16.811 { 00:19:16.811 "name": "BaseBdev4", 00:19:16.811 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:16.811 "is_configured": true, 00:19:16.811 "data_offset": 0, 00:19:16.811 "data_size": 65536 00:19:16.811 } 00:19:16.811 ] 00:19:16.811 } 00:19:16.811 } 00:19:16.811 }' 00:19:16.811 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:16.811 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:16.811 BaseBdev2 00:19:16.811 BaseBdev3 00:19:16.811 BaseBdev4' 00:19:16.811 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:16.811 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:16.811 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:17.071 "name": "NewBaseBdev", 00:19:17.071 "aliases": [ 00:19:17.071 "6a5e5291-25c2-4e5f-852e-0a0f24b1a268" 00:19:17.071 ], 00:19:17.071 "product_name": "Malloc disk", 00:19:17.071 "block_size": 512, 00:19:17.071 "num_blocks": 65536, 00:19:17.071 "uuid": "6a5e5291-25c2-4e5f-852e-0a0f24b1a268", 
00:19:17.071 "assigned_rate_limits": { 00:19:17.071 "rw_ios_per_sec": 0, 00:19:17.071 "rw_mbytes_per_sec": 0, 00:19:17.071 "r_mbytes_per_sec": 0, 00:19:17.071 "w_mbytes_per_sec": 0 00:19:17.071 }, 00:19:17.071 "claimed": true, 00:19:17.071 "claim_type": "exclusive_write", 00:19:17.071 "zoned": false, 00:19:17.071 "supported_io_types": { 00:19:17.071 "read": true, 00:19:17.071 "write": true, 00:19:17.071 "unmap": true, 00:19:17.071 "flush": true, 00:19:17.071 "reset": true, 00:19:17.071 "nvme_admin": false, 00:19:17.071 "nvme_io": false, 00:19:17.071 "nvme_io_md": false, 00:19:17.071 "write_zeroes": true, 00:19:17.071 "zcopy": true, 00:19:17.071 "get_zone_info": false, 00:19:17.071 "zone_management": false, 00:19:17.071 "zone_append": false, 00:19:17.071 "compare": false, 00:19:17.071 "compare_and_write": false, 00:19:17.071 "abort": true, 00:19:17.071 "seek_hole": false, 00:19:17.071 "seek_data": false, 00:19:17.071 "copy": true, 00:19:17.071 "nvme_iov_md": false 00:19:17.071 }, 00:19:17.071 "memory_domains": [ 00:19:17.071 { 00:19:17.071 "dma_device_id": "system", 00:19:17.071 "dma_device_type": 1 00:19:17.071 }, 00:19:17.071 { 00:19:17.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:17.071 "dma_device_type": 2 00:19:17.071 } 00:19:17.071 ], 00:19:17.071 "driver_specific": {} 00:19:17.071 }' 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.071 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.330 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:17.330 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.330 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.330 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:17.330 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:17.330 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:17.330 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:17.590 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:17.590 "name": "BaseBdev2", 00:19:17.590 "aliases": [ 00:19:17.590 "b95ec86a-976f-40ea-8385-6eb0e08072bb" 00:19:17.590 ], 00:19:17.590 "product_name": "Malloc disk", 00:19:17.590 "block_size": 512, 00:19:17.590 "num_blocks": 65536, 00:19:17.590 "uuid": "b95ec86a-976f-40ea-8385-6eb0e08072bb", 00:19:17.590 "assigned_rate_limits": { 00:19:17.590 "rw_ios_per_sec": 0, 00:19:17.590 "rw_mbytes_per_sec": 0, 00:19:17.590 "r_mbytes_per_sec": 0, 00:19:17.590 "w_mbytes_per_sec": 0 
00:19:17.590 }, 00:19:17.590 "claimed": true, 00:19:17.590 "claim_type": "exclusive_write", 00:19:17.590 "zoned": false, 00:19:17.590 "supported_io_types": { 00:19:17.590 "read": true, 00:19:17.590 "write": true, 00:19:17.590 "unmap": true, 00:19:17.590 "flush": true, 00:19:17.590 "reset": true, 00:19:17.590 "nvme_admin": false, 00:19:17.590 "nvme_io": false, 00:19:17.590 "nvme_io_md": false, 00:19:17.590 "write_zeroes": true, 00:19:17.590 "zcopy": true, 00:19:17.590 "get_zone_info": false, 00:19:17.590 "zone_management": false, 00:19:17.590 "zone_append": false, 00:19:17.590 "compare": false, 00:19:17.590 "compare_and_write": false, 00:19:17.590 "abort": true, 00:19:17.590 "seek_hole": false, 00:19:17.590 "seek_data": false, 00:19:17.590 "copy": true, 00:19:17.590 "nvme_iov_md": false 00:19:17.590 }, 00:19:17.590 "memory_domains": [ 00:19:17.590 { 00:19:17.590 "dma_device_id": "system", 00:19:17.590 "dma_device_type": 1 00:19:17.590 }, 00:19:17.590 { 00:19:17.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:17.590 "dma_device_type": 2 00:19:17.591 } 00:19:17.591 ], 00:19:17.591 "driver_specific": {} 00:19:17.591 }' 00:19:17.591 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.591 07:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:17.591 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:17.591 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.591 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:17.591 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:17.591 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:17.850 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:18.109 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:18.109 "name": "BaseBdev3", 00:19:18.109 "aliases": [ 00:19:18.109 "281a6409-b60b-4049-9c0e-537a401cd138" 00:19:18.109 ], 00:19:18.109 "product_name": "Malloc disk", 00:19:18.109 "block_size": 512, 00:19:18.109 "num_blocks": 65536, 00:19:18.109 "uuid": "281a6409-b60b-4049-9c0e-537a401cd138", 00:19:18.109 "assigned_rate_limits": { 00:19:18.109 "rw_ios_per_sec": 0, 00:19:18.109 "rw_mbytes_per_sec": 0, 00:19:18.109 "r_mbytes_per_sec": 0, 00:19:18.109 "w_mbytes_per_sec": 0 00:19:18.109 }, 00:19:18.109 "claimed": true, 00:19:18.109 "claim_type": "exclusive_write", 00:19:18.109 "zoned": false, 00:19:18.109 "supported_io_types": { 00:19:18.109 "read": 
true, 00:19:18.109 "write": true, 00:19:18.109 "unmap": true, 00:19:18.109 "flush": true, 00:19:18.109 "reset": true, 00:19:18.109 "nvme_admin": false, 00:19:18.109 "nvme_io": false, 00:19:18.109 "nvme_io_md": false, 00:19:18.109 "write_zeroes": true, 00:19:18.109 "zcopy": true, 00:19:18.109 "get_zone_info": false, 00:19:18.109 "zone_management": false, 00:19:18.109 "zone_append": false, 00:19:18.109 "compare": false, 00:19:18.109 "compare_and_write": false, 00:19:18.109 "abort": true, 00:19:18.109 "seek_hole": false, 00:19:18.109 "seek_data": false, 00:19:18.109 "copy": true, 00:19:18.109 "nvme_iov_md": false 00:19:18.109 }, 00:19:18.109 "memory_domains": [ 00:19:18.109 { 00:19:18.109 "dma_device_id": "system", 00:19:18.109 "dma_device_type": 1 00:19:18.109 }, 00:19:18.109 { 00:19:18.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.109 "dma_device_type": 2 00:19:18.109 } 00:19:18.109 ], 00:19:18.109 "driver_specific": {} 00:19:18.109 }' 00:19:18.109 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.109 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.109 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:18.110 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.110 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.368 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:18.368 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:18.368 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:18.368 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:18.368 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:18.369 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:18.369 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:18.369 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:18.369 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:18.369 07:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:18.628 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:18.628 "name": "BaseBdev4", 00:19:18.628 "aliases": [ 00:19:18.628 "8ec33c5f-bf4c-495d-9095-f393a9bb64c8" 00:19:18.628 ], 00:19:18.628 "product_name": "Malloc disk", 00:19:18.628 "block_size": 512, 00:19:18.628 "num_blocks": 65536, 00:19:18.628 "uuid": "8ec33c5f-bf4c-495d-9095-f393a9bb64c8", 00:19:18.628 "assigned_rate_limits": { 00:19:18.628 "rw_ios_per_sec": 0, 00:19:18.628 "rw_mbytes_per_sec": 0, 00:19:18.628 "r_mbytes_per_sec": 0, 00:19:18.628 "w_mbytes_per_sec": 0 00:19:18.628 }, 00:19:18.628 "claimed": true, 00:19:18.628 "claim_type": "exclusive_write", 00:19:18.628 "zoned": false, 00:19:18.628 "supported_io_types": { 00:19:18.628 "read": true, 00:19:18.628 "write": true, 00:19:18.628 "unmap": true, 00:19:18.628 "flush": true, 00:19:18.628 "reset": true, 00:19:18.628 "nvme_admin": false, 00:19:18.628 "nvme_io": 
false, 00:19:18.628 "nvme_io_md": false, 00:19:18.628 "write_zeroes": true, 00:19:18.628 "zcopy": true, 00:19:18.628 "get_zone_info": false, 00:19:18.628 "zone_management": false, 00:19:18.628 "zone_append": false, 00:19:18.628 "compare": false, 00:19:18.628 "compare_and_write": false, 00:19:18.628 "abort": true, 00:19:18.628 "seek_hole": false, 00:19:18.628 "seek_data": false, 00:19:18.628 "copy": true, 00:19:18.628 "nvme_iov_md": false 00:19:18.628 }, 00:19:18.628 "memory_domains": [ 00:19:18.628 { 00:19:18.628 "dma_device_id": "system", 00:19:18.628 "dma_device_type": 1 00:19:18.628 }, 00:19:18.628 { 00:19:18.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.628 "dma_device_type": 2 00:19:18.628 } 00:19:18.628 ], 00:19:18.628 "driver_specific": {} 00:19:18.628 }' 00:19:18.628 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.628 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:18.887 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:19.146 [2024-07-25 07:24:51.637428] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:19.146 [2024-07-25 07:24:51.637451] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:19.146 [2024-07-25 07:24:51.637498] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:19.146 [2024-07-25 07:24:51.637555] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:19.146 [2024-07-25 07:24:51.637566] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8d4a30 name Existed_Raid, state offline 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1660520 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1660520 ']' 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1660520 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:19.146 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps 
--no-headers -o comm= 1660520 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1660520' 00:19:19.406 killing process with pid 1660520 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1660520 00:19:19.406 [2024-07-25 07:24:51.713389] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1660520 00:19:19.406 [2024-07-25 07:24:51.745111] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:19.406 00:19:19.406 real 0m32.305s 00:19:19.406 user 0m59.336s 00:19:19.406 sys 0m5.757s 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:19.406 07:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.406 ************************************ 00:19:19.406 END TEST raid_state_function_test 00:19:19.406 ************************************ 00:19:19.666 07:24:51 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:19:19.666 07:24:51 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:19:19.666 07:24:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:19.666 07:24:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:19.666 ************************************ 00:19:19.666 START TEST raid_state_function_test_sb 00:19:19.666 ************************************ 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 true 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:19.666 07:24:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:19.666 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1666495 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1666495' 00:19:19.667 Process raid pid: 1666495 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1666495 /var/tmp/spdk-raid.sock 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1666495 ']' 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:19.667 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:19.667 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.667 [2024-07-25 07:24:52.058700] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
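Functionally, the superblock variant starting here differs from the run above only in the -s flag handed to bdev_raid_create (the test's superblock_create_arg), so that raid metadata is persisted as a superblock on the base bdevs. Once the bdev_svc app is listening on the raid socket, the create call it issues looks like this (arguments as logged further down; a sketch only):

/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid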
00:19:19.667 [2024-07-25 07:24:52.058749] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.0 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.1 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.2 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.3 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.4 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.5 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.6 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:01.7 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.0 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.1 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.2 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.3 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.4 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.5 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.6 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3d:02.7 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.0 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.1 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.2 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.3 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.4 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.5 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.6 cannot be used 00:19:19.667 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:01.7 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.0 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.1 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.2 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.3 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.4 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.5 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.6 cannot be used 00:19:19.667 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:19.667 EAL: Requested device 0000:3f:02.7 cannot be used 00:19:19.667 [2024-07-25 07:24:52.174778] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:19.927 [2024-07-25 07:24:52.263830] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:19:19.927 [2024-07-25 07:24:52.324912] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:19.927 [2024-07-25 07:24:52.324946] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:20.495 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:20.495 07:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:19:20.495 07:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:20.754 [2024-07-25 07:24:53.183220] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:20.754 [2024-07-25 07:24:53.183258] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:20.754 [2024-07-25 07:24:53.183269] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:20.755 [2024-07-25 07:24:53.183279] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:20.755 [2024-07-25 07:24:53.183287] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:20.755 [2024-07-25 07:24:53.183298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:20.755 [2024-07-25 07:24:53.183305] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:20.755 [2024-07-25 07:24:53.183315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.755 07:24:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.755 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.018 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.018 "name": "Existed_Raid", 00:19:21.018 "uuid": "30083d6a-93b7-41f8-b0ac-afb434ddf699", 00:19:21.018 "strip_size_kb": 64, 00:19:21.018 "state": "configuring", 00:19:21.018 "raid_level": "raid0", 00:19:21.018 "superblock": true, 00:19:21.018 "num_base_bdevs": 4, 00:19:21.018 "num_base_bdevs_discovered": 0, 00:19:21.018 "num_base_bdevs_operational": 4, 00:19:21.018 "base_bdevs_list": [ 00:19:21.018 { 00:19:21.018 "name": "BaseBdev1", 00:19:21.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.018 "is_configured": false, 00:19:21.018 "data_offset": 0, 00:19:21.018 "data_size": 0 00:19:21.018 }, 00:19:21.018 { 00:19:21.018 "name": "BaseBdev2", 00:19:21.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.018 "is_configured": false, 00:19:21.018 "data_offset": 0, 00:19:21.018 "data_size": 0 00:19:21.018 }, 00:19:21.018 { 00:19:21.018 "name": "BaseBdev3", 00:19:21.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.018 "is_configured": false, 00:19:21.018 "data_offset": 0, 00:19:21.018 "data_size": 0 00:19:21.018 }, 00:19:21.018 { 00:19:21.018 "name": "BaseBdev4", 00:19:21.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.018 "is_configured": false, 00:19:21.018 "data_offset": 0, 00:19:21.018 "data_size": 0 00:19:21.018 } 00:19:21.018 ] 00:19:21.018 }' 00:19:21.018 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.018 07:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:21.589 07:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:21.849 [2024-07-25 07:24:54.205763] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:21.849 [2024-07-25 07:24:54.205790] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2511ee0 name Existed_Raid, state configuring 00:19:21.849 07:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r 
raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:22.108 [2024-07-25 07:24:54.430377] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:22.108 [2024-07-25 07:24:54.430403] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:22.108 [2024-07-25 07:24:54.430413] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:22.108 [2024-07-25 07:24:54.430423] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:22.108 [2024-07-25 07:24:54.430431] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:22.108 [2024-07-25 07:24:54.430441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:22.108 [2024-07-25 07:24:54.430449] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:22.108 [2024-07-25 07:24:54.430459] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:22.108 07:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:22.368 [2024-07-25 07:24:54.668340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:22.368 BaseBdev1 00:19:22.368 07:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:22.368 07:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:22.368 07:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:22.368 07:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:22.368 07:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:22.368 07:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:22.368 07:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:22.627 07:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:22.627 [ 00:19:22.627 { 00:19:22.627 "name": "BaseBdev1", 00:19:22.627 "aliases": [ 00:19:22.627 "2608dd13-fb3a-4c36-adbb-8c3d35946332" 00:19:22.627 ], 00:19:22.627 "product_name": "Malloc disk", 00:19:22.627 "block_size": 512, 00:19:22.627 "num_blocks": 65536, 00:19:22.627 "uuid": "2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:22.627 "assigned_rate_limits": { 00:19:22.627 "rw_ios_per_sec": 0, 00:19:22.627 "rw_mbytes_per_sec": 0, 00:19:22.627 "r_mbytes_per_sec": 0, 00:19:22.627 "w_mbytes_per_sec": 0 00:19:22.627 }, 00:19:22.627 "claimed": true, 00:19:22.627 "claim_type": "exclusive_write", 00:19:22.627 "zoned": false, 00:19:22.627 "supported_io_types": { 00:19:22.627 "read": true, 00:19:22.627 "write": true, 00:19:22.627 "unmap": true, 00:19:22.627 "flush": true, 00:19:22.627 "reset": true, 00:19:22.627 "nvme_admin": false, 00:19:22.627 "nvme_io": false, 00:19:22.627 "nvme_io_md": false, 00:19:22.627 "write_zeroes": true, 
00:19:22.627 "zcopy": true, 00:19:22.627 "get_zone_info": false, 00:19:22.627 "zone_management": false, 00:19:22.627 "zone_append": false, 00:19:22.627 "compare": false, 00:19:22.627 "compare_and_write": false, 00:19:22.627 "abort": true, 00:19:22.627 "seek_hole": false, 00:19:22.627 "seek_data": false, 00:19:22.627 "copy": true, 00:19:22.627 "nvme_iov_md": false 00:19:22.627 }, 00:19:22.627 "memory_domains": [ 00:19:22.627 { 00:19:22.627 "dma_device_id": "system", 00:19:22.627 "dma_device_type": 1 00:19:22.627 }, 00:19:22.627 { 00:19:22.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.627 "dma_device_type": 2 00:19:22.627 } 00:19:22.627 ], 00:19:22.627 "driver_specific": {} 00:19:22.627 } 00:19:22.627 ] 00:19:22.627 07:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:22.627 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:22.627 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.627 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:22.627 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:22.627 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:22.628 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:22.628 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.628 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.628 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.628 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.628 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.628 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.887 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.887 "name": "Existed_Raid", 00:19:22.887 "uuid": "72291655-c04f-4920-8c95-88cc736cf993", 00:19:22.887 "strip_size_kb": 64, 00:19:22.887 "state": "configuring", 00:19:22.887 "raid_level": "raid0", 00:19:22.887 "superblock": true, 00:19:22.887 "num_base_bdevs": 4, 00:19:22.887 "num_base_bdevs_discovered": 1, 00:19:22.887 "num_base_bdevs_operational": 4, 00:19:22.887 "base_bdevs_list": [ 00:19:22.887 { 00:19:22.887 "name": "BaseBdev1", 00:19:22.887 "uuid": "2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:22.887 "is_configured": true, 00:19:22.887 "data_offset": 2048, 00:19:22.887 "data_size": 63488 00:19:22.887 }, 00:19:22.887 { 00:19:22.887 "name": "BaseBdev2", 00:19:22.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.887 "is_configured": false, 00:19:22.887 "data_offset": 0, 00:19:22.887 "data_size": 0 00:19:22.887 }, 00:19:22.887 { 00:19:22.887 "name": "BaseBdev3", 00:19:22.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.887 "is_configured": false, 00:19:22.887 "data_offset": 0, 00:19:22.887 "data_size": 0 00:19:22.887 }, 00:19:22.887 { 
00:19:22.887 "name": "BaseBdev4", 00:19:22.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.887 "is_configured": false, 00:19:22.887 "data_offset": 0, 00:19:22.887 "data_size": 0 00:19:22.887 } 00:19:22.887 ] 00:19:22.887 }' 00:19:22.887 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.887 07:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.455 07:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:23.714 [2024-07-25 07:24:56.160287] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:23.714 [2024-07-25 07:24:56.160319] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2511750 name Existed_Raid, state configuring 00:19:23.714 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:23.973 [2024-07-25 07:24:56.380903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:23.973 [2024-07-25 07:24:56.382286] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:23.973 [2024-07-25 07:24:56.382315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:23.973 [2024-07-25 07:24:56.382325] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:23.973 [2024-07-25 07:24:56.382335] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:23.973 [2024-07-25 07:24:56.382343] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:23.973 [2024-07-25 07:24:56.382353] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.973 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.974 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.974 07:24:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.974 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:24.233 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.233 "name": "Existed_Raid", 00:19:24.233 "uuid": "3b04b2e0-c475-4c71-ac69-270b3cd854e9", 00:19:24.233 "strip_size_kb": 64, 00:19:24.233 "state": "configuring", 00:19:24.233 "raid_level": "raid0", 00:19:24.233 "superblock": true, 00:19:24.233 "num_base_bdevs": 4, 00:19:24.233 "num_base_bdevs_discovered": 1, 00:19:24.233 "num_base_bdevs_operational": 4, 00:19:24.233 "base_bdevs_list": [ 00:19:24.233 { 00:19:24.233 "name": "BaseBdev1", 00:19:24.233 "uuid": "2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:24.233 "is_configured": true, 00:19:24.233 "data_offset": 2048, 00:19:24.233 "data_size": 63488 00:19:24.233 }, 00:19:24.233 { 00:19:24.233 "name": "BaseBdev2", 00:19:24.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.233 "is_configured": false, 00:19:24.233 "data_offset": 0, 00:19:24.233 "data_size": 0 00:19:24.233 }, 00:19:24.233 { 00:19:24.233 "name": "BaseBdev3", 00:19:24.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.233 "is_configured": false, 00:19:24.233 "data_offset": 0, 00:19:24.233 "data_size": 0 00:19:24.233 }, 00:19:24.233 { 00:19:24.233 "name": "BaseBdev4", 00:19:24.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.233 "is_configured": false, 00:19:24.233 "data_offset": 0, 00:19:24.233 "data_size": 0 00:19:24.233 } 00:19:24.233 ] 00:19:24.233 }' 00:19:24.233 07:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.233 07:24:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:24.801 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:25.060 [2024-07-25 07:24:57.418802] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:25.060 BaseBdev2 00:19:25.060 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:25.060 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:25.060 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:25.060 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:25.060 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:25.060 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:25.060 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:25.319 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:25.577 [ 00:19:25.577 { 00:19:25.577 "name": "BaseBdev2", 00:19:25.577 "aliases": [ 00:19:25.577 
"52019e9e-5555-49b2-ab1d-c5e6bd53a451" 00:19:25.577 ], 00:19:25.577 "product_name": "Malloc disk", 00:19:25.577 "block_size": 512, 00:19:25.577 "num_blocks": 65536, 00:19:25.577 "uuid": "52019e9e-5555-49b2-ab1d-c5e6bd53a451", 00:19:25.577 "assigned_rate_limits": { 00:19:25.577 "rw_ios_per_sec": 0, 00:19:25.577 "rw_mbytes_per_sec": 0, 00:19:25.577 "r_mbytes_per_sec": 0, 00:19:25.577 "w_mbytes_per_sec": 0 00:19:25.577 }, 00:19:25.577 "claimed": true, 00:19:25.577 "claim_type": "exclusive_write", 00:19:25.577 "zoned": false, 00:19:25.577 "supported_io_types": { 00:19:25.577 "read": true, 00:19:25.577 "write": true, 00:19:25.577 "unmap": true, 00:19:25.577 "flush": true, 00:19:25.577 "reset": true, 00:19:25.577 "nvme_admin": false, 00:19:25.577 "nvme_io": false, 00:19:25.577 "nvme_io_md": false, 00:19:25.577 "write_zeroes": true, 00:19:25.577 "zcopy": true, 00:19:25.577 "get_zone_info": false, 00:19:25.577 "zone_management": false, 00:19:25.577 "zone_append": false, 00:19:25.577 "compare": false, 00:19:25.577 "compare_and_write": false, 00:19:25.577 "abort": true, 00:19:25.577 "seek_hole": false, 00:19:25.577 "seek_data": false, 00:19:25.577 "copy": true, 00:19:25.577 "nvme_iov_md": false 00:19:25.577 }, 00:19:25.577 "memory_domains": [ 00:19:25.577 { 00:19:25.577 "dma_device_id": "system", 00:19:25.577 "dma_device_type": 1 00:19:25.577 }, 00:19:25.577 { 00:19:25.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.578 "dma_device_type": 2 00:19:25.578 } 00:19:25.578 ], 00:19:25.578 "driver_specific": {} 00:19:25.578 } 00:19:25.578 ] 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.578 07:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.837 07:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.837 "name": "Existed_Raid", 
00:19:25.837 "uuid": "3b04b2e0-c475-4c71-ac69-270b3cd854e9", 00:19:25.837 "strip_size_kb": 64, 00:19:25.837 "state": "configuring", 00:19:25.837 "raid_level": "raid0", 00:19:25.837 "superblock": true, 00:19:25.837 "num_base_bdevs": 4, 00:19:25.837 "num_base_bdevs_discovered": 2, 00:19:25.837 "num_base_bdevs_operational": 4, 00:19:25.837 "base_bdevs_list": [ 00:19:25.837 { 00:19:25.837 "name": "BaseBdev1", 00:19:25.837 "uuid": "2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:25.837 "is_configured": true, 00:19:25.837 "data_offset": 2048, 00:19:25.837 "data_size": 63488 00:19:25.837 }, 00:19:25.837 { 00:19:25.837 "name": "BaseBdev2", 00:19:25.837 "uuid": "52019e9e-5555-49b2-ab1d-c5e6bd53a451", 00:19:25.837 "is_configured": true, 00:19:25.837 "data_offset": 2048, 00:19:25.837 "data_size": 63488 00:19:25.837 }, 00:19:25.837 { 00:19:25.837 "name": "BaseBdev3", 00:19:25.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.837 "is_configured": false, 00:19:25.837 "data_offset": 0, 00:19:25.837 "data_size": 0 00:19:25.837 }, 00:19:25.837 { 00:19:25.837 "name": "BaseBdev4", 00:19:25.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.837 "is_configured": false, 00:19:25.837 "data_offset": 0, 00:19:25.837 "data_size": 0 00:19:25.837 } 00:19:25.837 ] 00:19:25.837 }' 00:19:25.837 07:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.837 07:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:26.405 [2024-07-25 07:24:58.885935] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:26.405 BaseBdev3 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:26.405 07:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:26.664 07:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:26.923 [ 00:19:26.923 { 00:19:26.923 "name": "BaseBdev3", 00:19:26.923 "aliases": [ 00:19:26.923 "ec1ee972-60b0-41c0-a1a4-bec01997f904" 00:19:26.923 ], 00:19:26.923 "product_name": "Malloc disk", 00:19:26.923 "block_size": 512, 00:19:26.923 "num_blocks": 65536, 00:19:26.923 "uuid": "ec1ee972-60b0-41c0-a1a4-bec01997f904", 00:19:26.923 "assigned_rate_limits": { 00:19:26.923 "rw_ios_per_sec": 0, 00:19:26.923 "rw_mbytes_per_sec": 0, 00:19:26.923 "r_mbytes_per_sec": 0, 00:19:26.923 "w_mbytes_per_sec": 0 00:19:26.923 }, 00:19:26.923 "claimed": true, 00:19:26.923 
"claim_type": "exclusive_write", 00:19:26.923 "zoned": false, 00:19:26.923 "supported_io_types": { 00:19:26.923 "read": true, 00:19:26.923 "write": true, 00:19:26.923 "unmap": true, 00:19:26.923 "flush": true, 00:19:26.923 "reset": true, 00:19:26.923 "nvme_admin": false, 00:19:26.923 "nvme_io": false, 00:19:26.923 "nvme_io_md": false, 00:19:26.923 "write_zeroes": true, 00:19:26.923 "zcopy": true, 00:19:26.923 "get_zone_info": false, 00:19:26.923 "zone_management": false, 00:19:26.923 "zone_append": false, 00:19:26.923 "compare": false, 00:19:26.923 "compare_and_write": false, 00:19:26.923 "abort": true, 00:19:26.923 "seek_hole": false, 00:19:26.923 "seek_data": false, 00:19:26.923 "copy": true, 00:19:26.923 "nvme_iov_md": false 00:19:26.923 }, 00:19:26.923 "memory_domains": [ 00:19:26.923 { 00:19:26.923 "dma_device_id": "system", 00:19:26.923 "dma_device_type": 1 00:19:26.923 }, 00:19:26.923 { 00:19:26.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.923 "dma_device_type": 2 00:19:26.923 } 00:19:26.923 ], 00:19:26.923 "driver_specific": {} 00:19:26.923 } 00:19:26.923 ] 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.923 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:27.182 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.182 "name": "Existed_Raid", 00:19:27.182 "uuid": "3b04b2e0-c475-4c71-ac69-270b3cd854e9", 00:19:27.182 "strip_size_kb": 64, 00:19:27.182 "state": "configuring", 00:19:27.182 "raid_level": "raid0", 00:19:27.182 "superblock": true, 00:19:27.182 "num_base_bdevs": 4, 00:19:27.182 "num_base_bdevs_discovered": 3, 00:19:27.182 "num_base_bdevs_operational": 4, 00:19:27.182 "base_bdevs_list": [ 00:19:27.182 { 00:19:27.182 "name": "BaseBdev1", 00:19:27.182 "uuid": 
"2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:27.182 "is_configured": true, 00:19:27.182 "data_offset": 2048, 00:19:27.182 "data_size": 63488 00:19:27.182 }, 00:19:27.182 { 00:19:27.182 "name": "BaseBdev2", 00:19:27.182 "uuid": "52019e9e-5555-49b2-ab1d-c5e6bd53a451", 00:19:27.182 "is_configured": true, 00:19:27.182 "data_offset": 2048, 00:19:27.182 "data_size": 63488 00:19:27.182 }, 00:19:27.182 { 00:19:27.182 "name": "BaseBdev3", 00:19:27.182 "uuid": "ec1ee972-60b0-41c0-a1a4-bec01997f904", 00:19:27.182 "is_configured": true, 00:19:27.182 "data_offset": 2048, 00:19:27.182 "data_size": 63488 00:19:27.182 }, 00:19:27.182 { 00:19:27.182 "name": "BaseBdev4", 00:19:27.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.182 "is_configured": false, 00:19:27.182 "data_offset": 0, 00:19:27.182 "data_size": 0 00:19:27.182 } 00:19:27.182 ] 00:19:27.182 }' 00:19:27.182 07:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.182 07:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:27.750 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:28.009 [2024-07-25 07:25:00.356998] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:28.009 [2024-07-25 07:25:00.357163] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25127b0 00:19:28.009 [2024-07-25 07:25:00.357176] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:28.009 [2024-07-25 07:25:00.357337] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c59d0 00:19:28.009 [2024-07-25 07:25:00.357455] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25127b0 00:19:28.009 [2024-07-25 07:25:00.357464] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25127b0 00:19:28.009 [2024-07-25 07:25:00.357548] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:28.009 BaseBdev4 00:19:28.009 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:28.009 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:28.009 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:28.009 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:28.009 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:28.009 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:28.009 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.268 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:28.611 [ 00:19:28.611 { 00:19:28.611 "name": "BaseBdev4", 00:19:28.611 "aliases": [ 00:19:28.611 "a705d8e8-b475-4de0-b84f-24843ea671a5" 00:19:28.611 ], 00:19:28.611 "product_name": "Malloc disk", 00:19:28.611 "block_size": 512, 
00:19:28.611 "num_blocks": 65536, 00:19:28.611 "uuid": "a705d8e8-b475-4de0-b84f-24843ea671a5", 00:19:28.611 "assigned_rate_limits": { 00:19:28.611 "rw_ios_per_sec": 0, 00:19:28.611 "rw_mbytes_per_sec": 0, 00:19:28.611 "r_mbytes_per_sec": 0, 00:19:28.611 "w_mbytes_per_sec": 0 00:19:28.611 }, 00:19:28.611 "claimed": true, 00:19:28.611 "claim_type": "exclusive_write", 00:19:28.611 "zoned": false, 00:19:28.611 "supported_io_types": { 00:19:28.611 "read": true, 00:19:28.611 "write": true, 00:19:28.611 "unmap": true, 00:19:28.611 "flush": true, 00:19:28.611 "reset": true, 00:19:28.611 "nvme_admin": false, 00:19:28.611 "nvme_io": false, 00:19:28.611 "nvme_io_md": false, 00:19:28.611 "write_zeroes": true, 00:19:28.611 "zcopy": true, 00:19:28.611 "get_zone_info": false, 00:19:28.611 "zone_management": false, 00:19:28.611 "zone_append": false, 00:19:28.611 "compare": false, 00:19:28.611 "compare_and_write": false, 00:19:28.611 "abort": true, 00:19:28.611 "seek_hole": false, 00:19:28.611 "seek_data": false, 00:19:28.611 "copy": true, 00:19:28.611 "nvme_iov_md": false 00:19:28.611 }, 00:19:28.611 "memory_domains": [ 00:19:28.611 { 00:19:28.611 "dma_device_id": "system", 00:19:28.611 "dma_device_type": 1 00:19:28.611 }, 00:19:28.611 { 00:19:28.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.611 "dma_device_type": 2 00:19:28.611 } 00:19:28.611 ], 00:19:28.611 "driver_specific": {} 00:19:28.611 } 00:19:28.611 ] 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.611 07:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:28.611 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.611 "name": "Existed_Raid", 00:19:28.611 "uuid": "3b04b2e0-c475-4c71-ac69-270b3cd854e9", 00:19:28.611 "strip_size_kb": 64, 00:19:28.611 "state": "online", 00:19:28.611 
"raid_level": "raid0", 00:19:28.611 "superblock": true, 00:19:28.611 "num_base_bdevs": 4, 00:19:28.611 "num_base_bdevs_discovered": 4, 00:19:28.611 "num_base_bdevs_operational": 4, 00:19:28.611 "base_bdevs_list": [ 00:19:28.611 { 00:19:28.611 "name": "BaseBdev1", 00:19:28.611 "uuid": "2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:28.611 "is_configured": true, 00:19:28.611 "data_offset": 2048, 00:19:28.611 "data_size": 63488 00:19:28.611 }, 00:19:28.611 { 00:19:28.611 "name": "BaseBdev2", 00:19:28.611 "uuid": "52019e9e-5555-49b2-ab1d-c5e6bd53a451", 00:19:28.611 "is_configured": true, 00:19:28.611 "data_offset": 2048, 00:19:28.611 "data_size": 63488 00:19:28.611 }, 00:19:28.611 { 00:19:28.611 "name": "BaseBdev3", 00:19:28.611 "uuid": "ec1ee972-60b0-41c0-a1a4-bec01997f904", 00:19:28.611 "is_configured": true, 00:19:28.611 "data_offset": 2048, 00:19:28.611 "data_size": 63488 00:19:28.611 }, 00:19:28.611 { 00:19:28.611 "name": "BaseBdev4", 00:19:28.611 "uuid": "a705d8e8-b475-4de0-b84f-24843ea671a5", 00:19:28.611 "is_configured": true, 00:19:28.611 "data_offset": 2048, 00:19:28.611 "data_size": 63488 00:19:28.611 } 00:19:28.611 ] 00:19:28.611 }' 00:19:28.611 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.611 07:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:29.178 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:29.178 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:29.178 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:29.178 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:29.178 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:29.178 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:29.178 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:29.179 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:29.437 [2024-07-25 07:25:01.797123] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:29.437 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:29.437 "name": "Existed_Raid", 00:19:29.437 "aliases": [ 00:19:29.437 "3b04b2e0-c475-4c71-ac69-270b3cd854e9" 00:19:29.437 ], 00:19:29.437 "product_name": "Raid Volume", 00:19:29.437 "block_size": 512, 00:19:29.437 "num_blocks": 253952, 00:19:29.437 "uuid": "3b04b2e0-c475-4c71-ac69-270b3cd854e9", 00:19:29.437 "assigned_rate_limits": { 00:19:29.437 "rw_ios_per_sec": 0, 00:19:29.437 "rw_mbytes_per_sec": 0, 00:19:29.437 "r_mbytes_per_sec": 0, 00:19:29.437 "w_mbytes_per_sec": 0 00:19:29.437 }, 00:19:29.437 "claimed": false, 00:19:29.437 "zoned": false, 00:19:29.437 "supported_io_types": { 00:19:29.437 "read": true, 00:19:29.437 "write": true, 00:19:29.437 "unmap": true, 00:19:29.437 "flush": true, 00:19:29.437 "reset": true, 00:19:29.437 "nvme_admin": false, 00:19:29.437 "nvme_io": false, 00:19:29.437 "nvme_io_md": false, 00:19:29.437 "write_zeroes": true, 00:19:29.437 "zcopy": false, 00:19:29.437 "get_zone_info": false, 00:19:29.437 
"zone_management": false, 00:19:29.437 "zone_append": false, 00:19:29.437 "compare": false, 00:19:29.437 "compare_and_write": false, 00:19:29.437 "abort": false, 00:19:29.437 "seek_hole": false, 00:19:29.437 "seek_data": false, 00:19:29.437 "copy": false, 00:19:29.437 "nvme_iov_md": false 00:19:29.437 }, 00:19:29.438 "memory_domains": [ 00:19:29.438 { 00:19:29.438 "dma_device_id": "system", 00:19:29.438 "dma_device_type": 1 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.438 "dma_device_type": 2 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "dma_device_id": "system", 00:19:29.438 "dma_device_type": 1 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.438 "dma_device_type": 2 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "dma_device_id": "system", 00:19:29.438 "dma_device_type": 1 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.438 "dma_device_type": 2 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "dma_device_id": "system", 00:19:29.438 "dma_device_type": 1 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.438 "dma_device_type": 2 00:19:29.438 } 00:19:29.438 ], 00:19:29.438 "driver_specific": { 00:19:29.438 "raid": { 00:19:29.438 "uuid": "3b04b2e0-c475-4c71-ac69-270b3cd854e9", 00:19:29.438 "strip_size_kb": 64, 00:19:29.438 "state": "online", 00:19:29.438 "raid_level": "raid0", 00:19:29.438 "superblock": true, 00:19:29.438 "num_base_bdevs": 4, 00:19:29.438 "num_base_bdevs_discovered": 4, 00:19:29.438 "num_base_bdevs_operational": 4, 00:19:29.438 "base_bdevs_list": [ 00:19:29.438 { 00:19:29.438 "name": "BaseBdev1", 00:19:29.438 "uuid": "2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:29.438 "is_configured": true, 00:19:29.438 "data_offset": 2048, 00:19:29.438 "data_size": 63488 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "name": "BaseBdev2", 00:19:29.438 "uuid": "52019e9e-5555-49b2-ab1d-c5e6bd53a451", 00:19:29.438 "is_configured": true, 00:19:29.438 "data_offset": 2048, 00:19:29.438 "data_size": 63488 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "name": "BaseBdev3", 00:19:29.438 "uuid": "ec1ee972-60b0-41c0-a1a4-bec01997f904", 00:19:29.438 "is_configured": true, 00:19:29.438 "data_offset": 2048, 00:19:29.438 "data_size": 63488 00:19:29.438 }, 00:19:29.438 { 00:19:29.438 "name": "BaseBdev4", 00:19:29.438 "uuid": "a705d8e8-b475-4de0-b84f-24843ea671a5", 00:19:29.438 "is_configured": true, 00:19:29.438 "data_offset": 2048, 00:19:29.438 "data_size": 63488 00:19:29.438 } 00:19:29.438 ] 00:19:29.438 } 00:19:29.438 } 00:19:29.438 }' 00:19:29.438 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:29.438 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:29.438 BaseBdev2 00:19:29.438 BaseBdev3 00:19:29.438 BaseBdev4' 00:19:29.438 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.438 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:29.438 07:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.697 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.697 
"name": "BaseBdev1", 00:19:29.697 "aliases": [ 00:19:29.697 "2608dd13-fb3a-4c36-adbb-8c3d35946332" 00:19:29.697 ], 00:19:29.697 "product_name": "Malloc disk", 00:19:29.697 "block_size": 512, 00:19:29.697 "num_blocks": 65536, 00:19:29.697 "uuid": "2608dd13-fb3a-4c36-adbb-8c3d35946332", 00:19:29.697 "assigned_rate_limits": { 00:19:29.697 "rw_ios_per_sec": 0, 00:19:29.697 "rw_mbytes_per_sec": 0, 00:19:29.697 "r_mbytes_per_sec": 0, 00:19:29.697 "w_mbytes_per_sec": 0 00:19:29.697 }, 00:19:29.697 "claimed": true, 00:19:29.697 "claim_type": "exclusive_write", 00:19:29.697 "zoned": false, 00:19:29.697 "supported_io_types": { 00:19:29.697 "read": true, 00:19:29.697 "write": true, 00:19:29.697 "unmap": true, 00:19:29.697 "flush": true, 00:19:29.697 "reset": true, 00:19:29.697 "nvme_admin": false, 00:19:29.697 "nvme_io": false, 00:19:29.697 "nvme_io_md": false, 00:19:29.697 "write_zeroes": true, 00:19:29.697 "zcopy": true, 00:19:29.697 "get_zone_info": false, 00:19:29.697 "zone_management": false, 00:19:29.697 "zone_append": false, 00:19:29.697 "compare": false, 00:19:29.697 "compare_and_write": false, 00:19:29.697 "abort": true, 00:19:29.697 "seek_hole": false, 00:19:29.697 "seek_data": false, 00:19:29.697 "copy": true, 00:19:29.697 "nvme_iov_md": false 00:19:29.697 }, 00:19:29.697 "memory_domains": [ 00:19:29.697 { 00:19:29.697 "dma_device_id": "system", 00:19:29.697 "dma_device_type": 1 00:19:29.697 }, 00:19:29.697 { 00:19:29.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.697 "dma_device_type": 2 00:19:29.697 } 00:19:29.697 ], 00:19:29.697 "driver_specific": {} 00:19:29.697 }' 00:19:29.697 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.697 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.697 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.697 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.697 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.955 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.955 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.955 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.955 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.956 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.956 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.956 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.956 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.956 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:29.956 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.214 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.214 "name": "BaseBdev2", 00:19:30.214 "aliases": [ 00:19:30.214 "52019e9e-5555-49b2-ab1d-c5e6bd53a451" 00:19:30.214 ], 00:19:30.214 
"product_name": "Malloc disk", 00:19:30.214 "block_size": 512, 00:19:30.214 "num_blocks": 65536, 00:19:30.214 "uuid": "52019e9e-5555-49b2-ab1d-c5e6bd53a451", 00:19:30.214 "assigned_rate_limits": { 00:19:30.214 "rw_ios_per_sec": 0, 00:19:30.214 "rw_mbytes_per_sec": 0, 00:19:30.214 "r_mbytes_per_sec": 0, 00:19:30.214 "w_mbytes_per_sec": 0 00:19:30.214 }, 00:19:30.214 "claimed": true, 00:19:30.214 "claim_type": "exclusive_write", 00:19:30.214 "zoned": false, 00:19:30.214 "supported_io_types": { 00:19:30.214 "read": true, 00:19:30.214 "write": true, 00:19:30.214 "unmap": true, 00:19:30.214 "flush": true, 00:19:30.214 "reset": true, 00:19:30.214 "nvme_admin": false, 00:19:30.214 "nvme_io": false, 00:19:30.214 "nvme_io_md": false, 00:19:30.214 "write_zeroes": true, 00:19:30.214 "zcopy": true, 00:19:30.214 "get_zone_info": false, 00:19:30.214 "zone_management": false, 00:19:30.214 "zone_append": false, 00:19:30.214 "compare": false, 00:19:30.214 "compare_and_write": false, 00:19:30.214 "abort": true, 00:19:30.214 "seek_hole": false, 00:19:30.214 "seek_data": false, 00:19:30.214 "copy": true, 00:19:30.214 "nvme_iov_md": false 00:19:30.214 }, 00:19:30.214 "memory_domains": [ 00:19:30.214 { 00:19:30.214 "dma_device_id": "system", 00:19:30.214 "dma_device_type": 1 00:19:30.214 }, 00:19:30.214 { 00:19:30.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.214 "dma_device_type": 2 00:19:30.214 } 00:19:30.214 ], 00:19:30.214 "driver_specific": {} 00:19:30.214 }' 00:19:30.214 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.214 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.474 07:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.474 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.474 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.732 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:30.732 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.732 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.732 "name": "BaseBdev3", 00:19:30.732 "aliases": [ 00:19:30.732 "ec1ee972-60b0-41c0-a1a4-bec01997f904" 00:19:30.732 ], 00:19:30.732 "product_name": "Malloc disk", 00:19:30.732 "block_size": 512, 00:19:30.732 "num_blocks": 65536, 00:19:30.732 "uuid": 
"ec1ee972-60b0-41c0-a1a4-bec01997f904", 00:19:30.732 "assigned_rate_limits": { 00:19:30.732 "rw_ios_per_sec": 0, 00:19:30.732 "rw_mbytes_per_sec": 0, 00:19:30.732 "r_mbytes_per_sec": 0, 00:19:30.732 "w_mbytes_per_sec": 0 00:19:30.732 }, 00:19:30.732 "claimed": true, 00:19:30.732 "claim_type": "exclusive_write", 00:19:30.733 "zoned": false, 00:19:30.733 "supported_io_types": { 00:19:30.733 "read": true, 00:19:30.733 "write": true, 00:19:30.733 "unmap": true, 00:19:30.733 "flush": true, 00:19:30.733 "reset": true, 00:19:30.733 "nvme_admin": false, 00:19:30.733 "nvme_io": false, 00:19:30.733 "nvme_io_md": false, 00:19:30.733 "write_zeroes": true, 00:19:30.733 "zcopy": true, 00:19:30.733 "get_zone_info": false, 00:19:30.733 "zone_management": false, 00:19:30.733 "zone_append": false, 00:19:30.733 "compare": false, 00:19:30.733 "compare_and_write": false, 00:19:30.733 "abort": true, 00:19:30.733 "seek_hole": false, 00:19:30.733 "seek_data": false, 00:19:30.733 "copy": true, 00:19:30.733 "nvme_iov_md": false 00:19:30.733 }, 00:19:30.733 "memory_domains": [ 00:19:30.733 { 00:19:30.733 "dma_device_id": "system", 00:19:30.733 "dma_device_type": 1 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.733 "dma_device_type": 2 00:19:30.733 } 00:19:30.733 ], 00:19:30.733 "driver_specific": {} 00:19:30.733 }' 00:19:30.733 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.994 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.252 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.252 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.252 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:31.252 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:31.252 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:31.511 "name": "BaseBdev4", 00:19:31.511 "aliases": [ 00:19:31.511 "a705d8e8-b475-4de0-b84f-24843ea671a5" 00:19:31.511 ], 00:19:31.511 "product_name": "Malloc disk", 00:19:31.511 "block_size": 512, 00:19:31.511 "num_blocks": 65536, 00:19:31.511 "uuid": "a705d8e8-b475-4de0-b84f-24843ea671a5", 00:19:31.511 "assigned_rate_limits": { 00:19:31.511 "rw_ios_per_sec": 0, 00:19:31.511 
"rw_mbytes_per_sec": 0, 00:19:31.511 "r_mbytes_per_sec": 0, 00:19:31.511 "w_mbytes_per_sec": 0 00:19:31.511 }, 00:19:31.511 "claimed": true, 00:19:31.511 "claim_type": "exclusive_write", 00:19:31.511 "zoned": false, 00:19:31.511 "supported_io_types": { 00:19:31.511 "read": true, 00:19:31.511 "write": true, 00:19:31.511 "unmap": true, 00:19:31.511 "flush": true, 00:19:31.511 "reset": true, 00:19:31.511 "nvme_admin": false, 00:19:31.511 "nvme_io": false, 00:19:31.511 "nvme_io_md": false, 00:19:31.511 "write_zeroes": true, 00:19:31.511 "zcopy": true, 00:19:31.511 "get_zone_info": false, 00:19:31.511 "zone_management": false, 00:19:31.511 "zone_append": false, 00:19:31.511 "compare": false, 00:19:31.511 "compare_and_write": false, 00:19:31.511 "abort": true, 00:19:31.511 "seek_hole": false, 00:19:31.511 "seek_data": false, 00:19:31.511 "copy": true, 00:19:31.511 "nvme_iov_md": false 00:19:31.511 }, 00:19:31.511 "memory_domains": [ 00:19:31.511 { 00:19:31.511 "dma_device_id": "system", 00:19:31.511 "dma_device_type": 1 00:19:31.511 }, 00:19:31.511 { 00:19:31.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:31.511 "dma_device_type": 2 00:19:31.511 } 00:19:31.511 ], 00:19:31.511 "driver_specific": {} 00:19:31.511 }' 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:31.511 07:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.511 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.770 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:31.770 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.770 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.770 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.770 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:32.029 [2024-07-25 07:25:04.367641] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:32.029 [2024-07-25 07:25:04.367665] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:32.029 [2024-07-25 07:25:04.367708] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.029 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.288 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.288 "name": "Existed_Raid", 00:19:32.288 "uuid": "3b04b2e0-c475-4c71-ac69-270b3cd854e9", 00:19:32.288 "strip_size_kb": 64, 00:19:32.288 "state": "offline", 00:19:32.288 "raid_level": "raid0", 00:19:32.288 "superblock": true, 00:19:32.288 "num_base_bdevs": 4, 00:19:32.288 "num_base_bdevs_discovered": 3, 00:19:32.288 "num_base_bdevs_operational": 3, 00:19:32.288 "base_bdevs_list": [ 00:19:32.288 { 00:19:32.288 "name": null, 00:19:32.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.288 "is_configured": false, 00:19:32.288 "data_offset": 2048, 00:19:32.288 "data_size": 63488 00:19:32.288 }, 00:19:32.288 { 00:19:32.288 "name": "BaseBdev2", 00:19:32.288 "uuid": "52019e9e-5555-49b2-ab1d-c5e6bd53a451", 00:19:32.288 "is_configured": true, 00:19:32.288 "data_offset": 2048, 00:19:32.288 "data_size": 63488 00:19:32.288 }, 00:19:32.288 { 00:19:32.288 "name": "BaseBdev3", 00:19:32.288 "uuid": "ec1ee972-60b0-41c0-a1a4-bec01997f904", 00:19:32.288 "is_configured": true, 00:19:32.288 "data_offset": 2048, 00:19:32.288 "data_size": 63488 00:19:32.288 }, 00:19:32.288 { 00:19:32.288 "name": "BaseBdev4", 00:19:32.288 "uuid": "a705d8e8-b475-4de0-b84f-24843ea671a5", 00:19:32.288 "is_configured": true, 00:19:32.288 "data_offset": 2048, 00:19:32.288 "data_size": 63488 00:19:32.288 } 00:19:32.288 ] 00:19:32.288 }' 00:19:32.288 07:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.288 07:25:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:32.856 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:32.856 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:32.856 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 
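[editor note] For reference, the offline-state check that bdev_raid.sh performs at this point (verify_raid_bdev_state Existed_Raid offline raid0 64 3) can be reproduced by hand against the same RPC socket. The sketch below is illustrative only and is not part of the captured output; the socket path, rpc.py path, bdev names, and jq filter are taken from the log entries above, and the flow assumes a running raid0 Existed_Raid whose base bdevs are Malloc-backed.

```bash
# Illustrative sketch (assumptions: same SPDK app, same socket, same bdev names as in the log).
sock=/var/tmp/spdk-raid.sock
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

# Delete one base bdev of the raid0 volume, mirroring the step logged above.
"$rpc" -s "$sock" bdev_malloc_delete BaseBdev1

# raid0 has no redundancy, so the raid bdev is expected to drop to "offline".
state=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state')
[[ "$state" == "offline" ]] && echo "Existed_Raid is offline as expected"
```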
00:19:32.856 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.115 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:33.115 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:33.115 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:33.115 [2024-07-25 07:25:05.632037] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:33.373 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:33.373 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:33.373 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.373 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:33.373 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:33.373 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:33.373 07:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:33.632 [2024-07-25 07:25:06.099454] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:33.632 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:33.632 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:33.632 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.632 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:33.891 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:33.891 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:33.891 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:34.150 [2024-07-25 07:25:06.562840] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:34.150 [2024-07-25 07:25:06.562877] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25127b0 name Existed_Raid, state offline 00:19:34.150 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:34.150 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:34.150 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.150 07:25:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:34.409 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:34.409 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:34.409 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:34.409 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:34.409 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:34.409 07:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:34.668 BaseBdev2 00:19:34.668 07:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:34.668 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:34.668 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:34.668 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:34.668 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:34.668 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:34.668 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:34.927 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:35.186 [ 00:19:35.186 { 00:19:35.186 "name": "BaseBdev2", 00:19:35.186 "aliases": [ 00:19:35.186 "d82763ac-c687-4a38-8b6b-16e2fed94ac0" 00:19:35.186 ], 00:19:35.186 "product_name": "Malloc disk", 00:19:35.186 "block_size": 512, 00:19:35.186 "num_blocks": 65536, 00:19:35.186 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:35.186 "assigned_rate_limits": { 00:19:35.186 "rw_ios_per_sec": 0, 00:19:35.186 "rw_mbytes_per_sec": 0, 00:19:35.186 "r_mbytes_per_sec": 0, 00:19:35.186 "w_mbytes_per_sec": 0 00:19:35.186 }, 00:19:35.186 "claimed": false, 00:19:35.186 "zoned": false, 00:19:35.186 "supported_io_types": { 00:19:35.186 "read": true, 00:19:35.186 "write": true, 00:19:35.186 "unmap": true, 00:19:35.186 "flush": true, 00:19:35.186 "reset": true, 00:19:35.186 "nvme_admin": false, 00:19:35.186 "nvme_io": false, 00:19:35.186 "nvme_io_md": false, 00:19:35.186 "write_zeroes": true, 00:19:35.186 "zcopy": true, 00:19:35.186 "get_zone_info": false, 00:19:35.186 "zone_management": false, 00:19:35.186 "zone_append": false, 00:19:35.186 "compare": false, 00:19:35.186 "compare_and_write": false, 00:19:35.186 "abort": true, 00:19:35.186 "seek_hole": false, 00:19:35.186 "seek_data": false, 00:19:35.186 "copy": true, 00:19:35.186 "nvme_iov_md": false 00:19:35.186 }, 00:19:35.186 "memory_domains": [ 00:19:35.186 { 00:19:35.186 "dma_device_id": "system", 00:19:35.186 "dma_device_type": 1 00:19:35.186 }, 00:19:35.186 { 00:19:35.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.186 "dma_device_type": 2 00:19:35.186 } 00:19:35.186 ], 00:19:35.186 
"driver_specific": {} 00:19:35.186 } 00:19:35.186 ] 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:35.186 BaseBdev3 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:35.186 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:35.444 07:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:35.702 [ 00:19:35.703 { 00:19:35.703 "name": "BaseBdev3", 00:19:35.703 "aliases": [ 00:19:35.703 "ff1c324d-a853-44f4-9636-247b9651e9eb" 00:19:35.703 ], 00:19:35.703 "product_name": "Malloc disk", 00:19:35.703 "block_size": 512, 00:19:35.703 "num_blocks": 65536, 00:19:35.703 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:35.703 "assigned_rate_limits": { 00:19:35.703 "rw_ios_per_sec": 0, 00:19:35.703 "rw_mbytes_per_sec": 0, 00:19:35.703 "r_mbytes_per_sec": 0, 00:19:35.703 "w_mbytes_per_sec": 0 00:19:35.703 }, 00:19:35.703 "claimed": false, 00:19:35.703 "zoned": false, 00:19:35.703 "supported_io_types": { 00:19:35.703 "read": true, 00:19:35.703 "write": true, 00:19:35.703 "unmap": true, 00:19:35.703 "flush": true, 00:19:35.703 "reset": true, 00:19:35.703 "nvme_admin": false, 00:19:35.703 "nvme_io": false, 00:19:35.703 "nvme_io_md": false, 00:19:35.703 "write_zeroes": true, 00:19:35.703 "zcopy": true, 00:19:35.703 "get_zone_info": false, 00:19:35.703 "zone_management": false, 00:19:35.703 "zone_append": false, 00:19:35.703 "compare": false, 00:19:35.703 "compare_and_write": false, 00:19:35.703 "abort": true, 00:19:35.703 "seek_hole": false, 00:19:35.703 "seek_data": false, 00:19:35.703 "copy": true, 00:19:35.703 "nvme_iov_md": false 00:19:35.703 }, 00:19:35.703 "memory_domains": [ 00:19:35.703 { 00:19:35.703 "dma_device_id": "system", 00:19:35.703 "dma_device_type": 1 00:19:35.703 }, 00:19:35.703 { 00:19:35.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.703 "dma_device_type": 2 00:19:35.703 } 00:19:35.703 ], 00:19:35.703 "driver_specific": {} 00:19:35.703 } 00:19:35.703 ] 00:19:35.703 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:35.703 07:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( 
i++ )) 00:19:35.703 07:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:35.703 07:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:35.962 BaseBdev4 00:19:35.962 07:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:35.962 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:35.962 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:35.962 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:35.962 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:35.962 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:35.962 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:36.221 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:36.479 [ 00:19:36.479 { 00:19:36.479 "name": "BaseBdev4", 00:19:36.479 "aliases": [ 00:19:36.479 "679442e0-2724-4c86-94e8-df0cd74e1558" 00:19:36.479 ], 00:19:36.479 "product_name": "Malloc disk", 00:19:36.480 "block_size": 512, 00:19:36.480 "num_blocks": 65536, 00:19:36.480 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:36.480 "assigned_rate_limits": { 00:19:36.480 "rw_ios_per_sec": 0, 00:19:36.480 "rw_mbytes_per_sec": 0, 00:19:36.480 "r_mbytes_per_sec": 0, 00:19:36.480 "w_mbytes_per_sec": 0 00:19:36.480 }, 00:19:36.480 "claimed": false, 00:19:36.480 "zoned": false, 00:19:36.480 "supported_io_types": { 00:19:36.480 "read": true, 00:19:36.480 "write": true, 00:19:36.480 "unmap": true, 00:19:36.480 "flush": true, 00:19:36.480 "reset": true, 00:19:36.480 "nvme_admin": false, 00:19:36.480 "nvme_io": false, 00:19:36.480 "nvme_io_md": false, 00:19:36.480 "write_zeroes": true, 00:19:36.480 "zcopy": true, 00:19:36.480 "get_zone_info": false, 00:19:36.480 "zone_management": false, 00:19:36.480 "zone_append": false, 00:19:36.480 "compare": false, 00:19:36.480 "compare_and_write": false, 00:19:36.480 "abort": true, 00:19:36.480 "seek_hole": false, 00:19:36.480 "seek_data": false, 00:19:36.480 "copy": true, 00:19:36.480 "nvme_iov_md": false 00:19:36.480 }, 00:19:36.480 "memory_domains": [ 00:19:36.480 { 00:19:36.480 "dma_device_id": "system", 00:19:36.480 "dma_device_type": 1 00:19:36.480 }, 00:19:36.480 { 00:19:36.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.480 "dma_device_type": 2 00:19:36.480 } 00:19:36.480 ], 00:19:36.480 "driver_specific": {} 00:19:36.480 } 00:19:36.480 ] 00:19:36.480 07:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:36.480 07:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:36.480 07:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:36.480 07:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:36.739 [2024-07-25 07:25:09.016315] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:36.739 [2024-07-25 07:25:09.016351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:36.739 [2024-07-25 07:25:09.016371] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:36.739 [2024-07-25 07:25:09.017596] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:36.739 [2024-07-25 07:25:09.017634] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.739 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.739 "name": "Existed_Raid", 00:19:36.739 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:36.739 "strip_size_kb": 64, 00:19:36.739 "state": "configuring", 00:19:36.739 "raid_level": "raid0", 00:19:36.739 "superblock": true, 00:19:36.740 "num_base_bdevs": 4, 00:19:36.740 "num_base_bdevs_discovered": 3, 00:19:36.740 "num_base_bdevs_operational": 4, 00:19:36.740 "base_bdevs_list": [ 00:19:36.740 { 00:19:36.740 "name": "BaseBdev1", 00:19:36.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.740 "is_configured": false, 00:19:36.740 "data_offset": 0, 00:19:36.740 "data_size": 0 00:19:36.740 }, 00:19:36.740 { 00:19:36.740 "name": "BaseBdev2", 00:19:36.740 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:36.740 "is_configured": true, 00:19:36.740 "data_offset": 2048, 00:19:36.740 "data_size": 63488 00:19:36.740 }, 00:19:36.740 { 00:19:36.740 "name": "BaseBdev3", 00:19:36.740 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:36.740 "is_configured": true, 00:19:36.740 "data_offset": 2048, 00:19:36.740 "data_size": 63488 00:19:36.740 }, 00:19:36.740 { 
00:19:36.740 "name": "BaseBdev4", 00:19:36.740 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:36.740 "is_configured": true, 00:19:36.740 "data_offset": 2048, 00:19:36.740 "data_size": 63488 00:19:36.740 } 00:19:36.740 ] 00:19:36.740 }' 00:19:36.740 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.740 07:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:37.307 07:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:37.566 [2024-07-25 07:25:10.047000] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.566 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:37.825 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.826 "name": "Existed_Raid", 00:19:37.826 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:37.826 "strip_size_kb": 64, 00:19:37.826 "state": "configuring", 00:19:37.826 "raid_level": "raid0", 00:19:37.826 "superblock": true, 00:19:37.826 "num_base_bdevs": 4, 00:19:37.826 "num_base_bdevs_discovered": 2, 00:19:37.826 "num_base_bdevs_operational": 4, 00:19:37.826 "base_bdevs_list": [ 00:19:37.826 { 00:19:37.826 "name": "BaseBdev1", 00:19:37.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.826 "is_configured": false, 00:19:37.826 "data_offset": 0, 00:19:37.826 "data_size": 0 00:19:37.826 }, 00:19:37.826 { 00:19:37.826 "name": null, 00:19:37.826 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:37.826 "is_configured": false, 00:19:37.826 "data_offset": 2048, 00:19:37.826 "data_size": 63488 00:19:37.826 }, 00:19:37.826 { 00:19:37.826 "name": "BaseBdev3", 00:19:37.826 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:37.826 "is_configured": true, 00:19:37.826 "data_offset": 2048, 00:19:37.826 "data_size": 63488 00:19:37.826 }, 00:19:37.826 { 00:19:37.826 "name": "BaseBdev4", 00:19:37.826 "uuid": 
"679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:37.826 "is_configured": true, 00:19:37.826 "data_offset": 2048, 00:19:37.826 "data_size": 63488 00:19:37.826 } 00:19:37.826 ] 00:19:37.826 }' 00:19:37.826 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.826 07:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:38.393 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.393 07:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:38.652 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:38.652 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:38.910 [2024-07-25 07:25:11.289364] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:38.910 BaseBdev1 00:19:38.910 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:38.910 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:38.910 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:38.910 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:38.910 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:38.910 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:38.910 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:39.169 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:39.428 [ 00:19:39.428 { 00:19:39.428 "name": "BaseBdev1", 00:19:39.428 "aliases": [ 00:19:39.428 "a3fbf59d-b29b-48c7-a298-19039811bc3b" 00:19:39.428 ], 00:19:39.428 "product_name": "Malloc disk", 00:19:39.428 "block_size": 512, 00:19:39.428 "num_blocks": 65536, 00:19:39.428 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:39.428 "assigned_rate_limits": { 00:19:39.428 "rw_ios_per_sec": 0, 00:19:39.428 "rw_mbytes_per_sec": 0, 00:19:39.428 "r_mbytes_per_sec": 0, 00:19:39.428 "w_mbytes_per_sec": 0 00:19:39.428 }, 00:19:39.428 "claimed": true, 00:19:39.428 "claim_type": "exclusive_write", 00:19:39.428 "zoned": false, 00:19:39.428 "supported_io_types": { 00:19:39.428 "read": true, 00:19:39.428 "write": true, 00:19:39.428 "unmap": true, 00:19:39.428 "flush": true, 00:19:39.428 "reset": true, 00:19:39.428 "nvme_admin": false, 00:19:39.428 "nvme_io": false, 00:19:39.428 "nvme_io_md": false, 00:19:39.428 "write_zeroes": true, 00:19:39.428 "zcopy": true, 00:19:39.428 "get_zone_info": false, 00:19:39.428 "zone_management": false, 00:19:39.428 "zone_append": false, 00:19:39.428 "compare": false, 00:19:39.428 "compare_and_write": false, 00:19:39.428 "abort": true, 00:19:39.428 "seek_hole": false, 
00:19:39.428 "seek_data": false, 00:19:39.428 "copy": true, 00:19:39.428 "nvme_iov_md": false 00:19:39.428 }, 00:19:39.428 "memory_domains": [ 00:19:39.428 { 00:19:39.428 "dma_device_id": "system", 00:19:39.428 "dma_device_type": 1 00:19:39.428 }, 00:19:39.428 { 00:19:39.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.428 "dma_device_type": 2 00:19:39.428 } 00:19:39.428 ], 00:19:39.428 "driver_specific": {} 00:19:39.428 } 00:19:39.428 ] 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.428 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:39.687 07:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.687 "name": "Existed_Raid", 00:19:39.687 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:39.687 "strip_size_kb": 64, 00:19:39.687 "state": "configuring", 00:19:39.687 "raid_level": "raid0", 00:19:39.687 "superblock": true, 00:19:39.687 "num_base_bdevs": 4, 00:19:39.687 "num_base_bdevs_discovered": 3, 00:19:39.687 "num_base_bdevs_operational": 4, 00:19:39.687 "base_bdevs_list": [ 00:19:39.687 { 00:19:39.687 "name": "BaseBdev1", 00:19:39.687 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:39.687 "is_configured": true, 00:19:39.687 "data_offset": 2048, 00:19:39.687 "data_size": 63488 00:19:39.687 }, 00:19:39.687 { 00:19:39.687 "name": null, 00:19:39.687 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:39.687 "is_configured": false, 00:19:39.687 "data_offset": 2048, 00:19:39.687 "data_size": 63488 00:19:39.687 }, 00:19:39.687 { 00:19:39.687 "name": "BaseBdev3", 00:19:39.687 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:39.687 "is_configured": true, 00:19:39.687 "data_offset": 2048, 00:19:39.687 "data_size": 63488 00:19:39.687 }, 00:19:39.687 { 00:19:39.687 "name": "BaseBdev4", 00:19:39.687 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:39.687 "is_configured": true, 00:19:39.687 "data_offset": 2048, 00:19:39.687 "data_size": 63488 00:19:39.687 } 00:19:39.687 ] 00:19:39.687 }' 00:19:39.687 07:25:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.687 07:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:40.253 07:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.253 07:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:40.253 07:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:40.253 07:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:40.512 [2024-07-25 07:25:12.989845] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.512 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:40.770 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.771 "name": "Existed_Raid", 00:19:40.771 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:40.771 "strip_size_kb": 64, 00:19:40.771 "state": "configuring", 00:19:40.771 "raid_level": "raid0", 00:19:40.771 "superblock": true, 00:19:40.771 "num_base_bdevs": 4, 00:19:40.771 "num_base_bdevs_discovered": 2, 00:19:40.771 "num_base_bdevs_operational": 4, 00:19:40.771 "base_bdevs_list": [ 00:19:40.771 { 00:19:40.771 "name": "BaseBdev1", 00:19:40.771 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:40.771 "is_configured": true, 00:19:40.771 "data_offset": 2048, 00:19:40.771 "data_size": 63488 00:19:40.771 }, 00:19:40.771 { 00:19:40.771 "name": null, 00:19:40.771 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:40.771 "is_configured": false, 00:19:40.771 "data_offset": 2048, 00:19:40.771 "data_size": 63488 00:19:40.771 }, 00:19:40.771 { 00:19:40.771 "name": null, 00:19:40.771 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 
00:19:40.771 "is_configured": false, 00:19:40.771 "data_offset": 2048, 00:19:40.771 "data_size": 63488 00:19:40.771 }, 00:19:40.771 { 00:19:40.771 "name": "BaseBdev4", 00:19:40.771 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:40.771 "is_configured": true, 00:19:40.771 "data_offset": 2048, 00:19:40.771 "data_size": 63488 00:19:40.771 } 00:19:40.771 ] 00:19:40.771 }' 00:19:40.771 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.771 07:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:41.338 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.338 07:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:41.648 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:41.648 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:41.906 [2024-07-25 07:25:14.233129] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:41.906 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:41.906 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:41.906 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:41.906 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.907 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:42.165 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:42.165 "name": "Existed_Raid", 00:19:42.165 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:42.165 "strip_size_kb": 64, 00:19:42.165 "state": "configuring", 00:19:42.165 "raid_level": "raid0", 00:19:42.165 "superblock": true, 00:19:42.165 "num_base_bdevs": 4, 00:19:42.165 "num_base_bdevs_discovered": 3, 00:19:42.165 "num_base_bdevs_operational": 4, 00:19:42.165 "base_bdevs_list": [ 00:19:42.165 { 00:19:42.165 "name": "BaseBdev1", 00:19:42.165 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:42.165 
"is_configured": true, 00:19:42.165 "data_offset": 2048, 00:19:42.165 "data_size": 63488 00:19:42.165 }, 00:19:42.165 { 00:19:42.165 "name": null, 00:19:42.165 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:42.165 "is_configured": false, 00:19:42.165 "data_offset": 2048, 00:19:42.165 "data_size": 63488 00:19:42.165 }, 00:19:42.165 { 00:19:42.165 "name": "BaseBdev3", 00:19:42.165 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:42.165 "is_configured": true, 00:19:42.165 "data_offset": 2048, 00:19:42.165 "data_size": 63488 00:19:42.165 }, 00:19:42.165 { 00:19:42.165 "name": "BaseBdev4", 00:19:42.165 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:42.165 "is_configured": true, 00:19:42.165 "data_offset": 2048, 00:19:42.165 "data_size": 63488 00:19:42.165 } 00:19:42.165 ] 00:19:42.165 }' 00:19:42.165 07:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:42.165 07:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:42.732 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.732 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:42.991 [2024-07-25 07:25:15.480422] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:42.991 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.249 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.249 "name": "Existed_Raid", 00:19:43.249 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:43.249 "strip_size_kb": 
64, 00:19:43.249 "state": "configuring", 00:19:43.249 "raid_level": "raid0", 00:19:43.249 "superblock": true, 00:19:43.249 "num_base_bdevs": 4, 00:19:43.249 "num_base_bdevs_discovered": 2, 00:19:43.249 "num_base_bdevs_operational": 4, 00:19:43.249 "base_bdevs_list": [ 00:19:43.249 { 00:19:43.249 "name": null, 00:19:43.249 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:43.249 "is_configured": false, 00:19:43.249 "data_offset": 2048, 00:19:43.249 "data_size": 63488 00:19:43.249 }, 00:19:43.249 { 00:19:43.249 "name": null, 00:19:43.249 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:43.249 "is_configured": false, 00:19:43.249 "data_offset": 2048, 00:19:43.249 "data_size": 63488 00:19:43.249 }, 00:19:43.249 { 00:19:43.249 "name": "BaseBdev3", 00:19:43.249 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:43.249 "is_configured": true, 00:19:43.249 "data_offset": 2048, 00:19:43.249 "data_size": 63488 00:19:43.249 }, 00:19:43.249 { 00:19:43.249 "name": "BaseBdev4", 00:19:43.249 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:43.249 "is_configured": true, 00:19:43.249 "data_offset": 2048, 00:19:43.249 "data_size": 63488 00:19:43.249 } 00:19:43.249 ] 00:19:43.249 }' 00:19:43.249 07:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.249 07:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:43.816 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.816 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:44.075 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:44.075 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:44.333 [2024-07-25 07:25:16.725634] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.333 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.334 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.334 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.334 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:44.592 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.592 "name": "Existed_Raid", 00:19:44.592 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:44.592 "strip_size_kb": 64, 00:19:44.592 "state": "configuring", 00:19:44.592 "raid_level": "raid0", 00:19:44.592 "superblock": true, 00:19:44.592 "num_base_bdevs": 4, 00:19:44.592 "num_base_bdevs_discovered": 3, 00:19:44.592 "num_base_bdevs_operational": 4, 00:19:44.592 "base_bdevs_list": [ 00:19:44.592 { 00:19:44.592 "name": null, 00:19:44.592 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:44.592 "is_configured": false, 00:19:44.592 "data_offset": 2048, 00:19:44.592 "data_size": 63488 00:19:44.592 }, 00:19:44.592 { 00:19:44.592 "name": "BaseBdev2", 00:19:44.592 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:44.592 "is_configured": true, 00:19:44.592 "data_offset": 2048, 00:19:44.592 "data_size": 63488 00:19:44.592 }, 00:19:44.592 { 00:19:44.592 "name": "BaseBdev3", 00:19:44.592 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:44.592 "is_configured": true, 00:19:44.592 "data_offset": 2048, 00:19:44.592 "data_size": 63488 00:19:44.592 }, 00:19:44.592 { 00:19:44.592 "name": "BaseBdev4", 00:19:44.592 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:44.592 "is_configured": true, 00:19:44.592 "data_offset": 2048, 00:19:44.592 "data_size": 63488 00:19:44.592 } 00:19:44.592 ] 00:19:44.592 }' 00:19:44.592 07:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.592 07:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.159 07:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.159 07:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:45.418 07:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:45.418 07:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:45.418 07:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.677 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a3fbf59d-b29b-48c7-a298-19039811bc3b 00:19:45.936 [2024-07-25 07:25:18.236646] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:45.936 [2024-07-25 07:25:18.236787] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x250a270 00:19:45.936 [2024-07-25 07:25:18.236799] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:45.936 [2024-07-25 07:25:18.236956] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x250ad20 00:19:45.936 [2024-07-25 07:25:18.237061] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250a270 00:19:45.936 [2024-07-25 07:25:18.237070] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250a270 00:19:45.936 [2024-07-25 07:25:18.237161] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:45.936 NewBaseBdev 00:19:45.936 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:45.936 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:19:45.936 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:45.936 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:45.936 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:45.936 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:45.936 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:46.195 [ 00:19:46.195 { 00:19:46.195 "name": "NewBaseBdev", 00:19:46.195 "aliases": [ 00:19:46.195 "a3fbf59d-b29b-48c7-a298-19039811bc3b" 00:19:46.195 ], 00:19:46.195 "product_name": "Malloc disk", 00:19:46.195 "block_size": 512, 00:19:46.195 "num_blocks": 65536, 00:19:46.195 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:46.195 "assigned_rate_limits": { 00:19:46.195 "rw_ios_per_sec": 0, 00:19:46.195 "rw_mbytes_per_sec": 0, 00:19:46.195 "r_mbytes_per_sec": 0, 00:19:46.195 "w_mbytes_per_sec": 0 00:19:46.195 }, 00:19:46.195 "claimed": true, 00:19:46.195 "claim_type": "exclusive_write", 00:19:46.195 "zoned": false, 00:19:46.195 "supported_io_types": { 00:19:46.195 "read": true, 00:19:46.195 "write": true, 00:19:46.195 "unmap": true, 00:19:46.195 "flush": true, 00:19:46.195 "reset": true, 00:19:46.195 "nvme_admin": false, 00:19:46.195 "nvme_io": false, 00:19:46.195 "nvme_io_md": false, 00:19:46.195 "write_zeroes": true, 00:19:46.195 "zcopy": true, 00:19:46.195 "get_zone_info": false, 00:19:46.195 "zone_management": false, 00:19:46.195 "zone_append": false, 00:19:46.195 "compare": false, 00:19:46.195 "compare_and_write": false, 00:19:46.195 "abort": true, 00:19:46.195 "seek_hole": false, 00:19:46.195 "seek_data": false, 00:19:46.195 "copy": true, 00:19:46.195 "nvme_iov_md": false 00:19:46.195 }, 00:19:46.195 "memory_domains": [ 00:19:46.195 { 00:19:46.195 "dma_device_id": "system", 00:19:46.195 "dma_device_type": 1 00:19:46.195 }, 00:19:46.195 { 00:19:46.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.195 "dma_device_type": 2 00:19:46.195 } 00:19:46.195 ], 00:19:46.195 "driver_specific": {} 00:19:46.195 } 00:19:46.195 ] 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:46.195 
07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.195 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.454 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.454 "name": "Existed_Raid", 00:19:46.454 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:46.454 "strip_size_kb": 64, 00:19:46.454 "state": "online", 00:19:46.454 "raid_level": "raid0", 00:19:46.454 "superblock": true, 00:19:46.454 "num_base_bdevs": 4, 00:19:46.454 "num_base_bdevs_discovered": 4, 00:19:46.454 "num_base_bdevs_operational": 4, 00:19:46.454 "base_bdevs_list": [ 00:19:46.454 { 00:19:46.454 "name": "NewBaseBdev", 00:19:46.454 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:46.454 "is_configured": true, 00:19:46.454 "data_offset": 2048, 00:19:46.454 "data_size": 63488 00:19:46.454 }, 00:19:46.454 { 00:19:46.454 "name": "BaseBdev2", 00:19:46.454 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:46.454 "is_configured": true, 00:19:46.454 "data_offset": 2048, 00:19:46.454 "data_size": 63488 00:19:46.454 }, 00:19:46.454 { 00:19:46.454 "name": "BaseBdev3", 00:19:46.454 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:46.454 "is_configured": true, 00:19:46.454 "data_offset": 2048, 00:19:46.454 "data_size": 63488 00:19:46.454 }, 00:19:46.454 { 00:19:46.454 "name": "BaseBdev4", 00:19:46.454 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:46.454 "is_configured": true, 00:19:46.454 "data_offset": 2048, 00:19:46.454 "data_size": 63488 00:19:46.454 } 00:19:46.454 ] 00:19:46.454 }' 00:19:46.454 07:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.454 07:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:47.022 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:47.281 [2024-07-25 07:25:19.744922] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:47.281 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:47.281 "name": "Existed_Raid", 00:19:47.282 "aliases": [ 00:19:47.282 "2bec0bb1-002a-4768-a7ba-69c7ea91069e" 00:19:47.282 ], 00:19:47.282 "product_name": "Raid Volume", 00:19:47.282 "block_size": 512, 00:19:47.282 "num_blocks": 253952, 00:19:47.282 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:47.282 "assigned_rate_limits": { 00:19:47.282 "rw_ios_per_sec": 0, 00:19:47.282 "rw_mbytes_per_sec": 0, 00:19:47.282 "r_mbytes_per_sec": 0, 00:19:47.282 "w_mbytes_per_sec": 0 00:19:47.282 }, 00:19:47.282 "claimed": false, 00:19:47.282 "zoned": false, 00:19:47.282 "supported_io_types": { 00:19:47.282 "read": true, 00:19:47.282 "write": true, 00:19:47.282 "unmap": true, 00:19:47.282 "flush": true, 00:19:47.282 "reset": true, 00:19:47.282 "nvme_admin": false, 00:19:47.282 "nvme_io": false, 00:19:47.282 "nvme_io_md": false, 00:19:47.282 "write_zeroes": true, 00:19:47.282 "zcopy": false, 00:19:47.282 "get_zone_info": false, 00:19:47.282 "zone_management": false, 00:19:47.282 "zone_append": false, 00:19:47.282 "compare": false, 00:19:47.282 "compare_and_write": false, 00:19:47.282 "abort": false, 00:19:47.282 "seek_hole": false, 00:19:47.282 "seek_data": false, 00:19:47.282 "copy": false, 00:19:47.282 "nvme_iov_md": false 00:19:47.282 }, 00:19:47.282 "memory_domains": [ 00:19:47.282 { 00:19:47.282 "dma_device_id": "system", 00:19:47.282 "dma_device_type": 1 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.282 "dma_device_type": 2 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "dma_device_id": "system", 00:19:47.282 "dma_device_type": 1 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.282 "dma_device_type": 2 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "dma_device_id": "system", 00:19:47.282 "dma_device_type": 1 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.282 "dma_device_type": 2 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "dma_device_id": "system", 00:19:47.282 "dma_device_type": 1 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.282 "dma_device_type": 2 00:19:47.282 } 00:19:47.282 ], 00:19:47.282 "driver_specific": { 00:19:47.282 "raid": { 00:19:47.282 "uuid": "2bec0bb1-002a-4768-a7ba-69c7ea91069e", 00:19:47.282 "strip_size_kb": 64, 00:19:47.282 "state": "online", 00:19:47.282 "raid_level": "raid0", 00:19:47.282 "superblock": true, 00:19:47.282 "num_base_bdevs": 4, 00:19:47.282 "num_base_bdevs_discovered": 4, 00:19:47.282 "num_base_bdevs_operational": 4, 00:19:47.282 "base_bdevs_list": [ 00:19:47.282 { 00:19:47.282 "name": "NewBaseBdev", 00:19:47.282 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:47.282 "is_configured": true, 00:19:47.282 "data_offset": 2048, 00:19:47.282 "data_size": 63488 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "name": "BaseBdev2", 00:19:47.282 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:47.282 "is_configured": true, 00:19:47.282 "data_offset": 2048, 00:19:47.282 "data_size": 63488 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 
"name": "BaseBdev3", 00:19:47.282 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:47.282 "is_configured": true, 00:19:47.282 "data_offset": 2048, 00:19:47.282 "data_size": 63488 00:19:47.282 }, 00:19:47.282 { 00:19:47.282 "name": "BaseBdev4", 00:19:47.282 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:47.282 "is_configured": true, 00:19:47.282 "data_offset": 2048, 00:19:47.282 "data_size": 63488 00:19:47.282 } 00:19:47.282 ] 00:19:47.282 } 00:19:47.282 } 00:19:47.282 }' 00:19:47.282 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:47.541 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:47.541 BaseBdev2 00:19:47.541 BaseBdev3 00:19:47.541 BaseBdev4' 00:19:47.541 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:47.541 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:47.541 07:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:47.541 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.541 "name": "NewBaseBdev", 00:19:47.541 "aliases": [ 00:19:47.541 "a3fbf59d-b29b-48c7-a298-19039811bc3b" 00:19:47.541 ], 00:19:47.541 "product_name": "Malloc disk", 00:19:47.541 "block_size": 512, 00:19:47.541 "num_blocks": 65536, 00:19:47.541 "uuid": "a3fbf59d-b29b-48c7-a298-19039811bc3b", 00:19:47.541 "assigned_rate_limits": { 00:19:47.541 "rw_ios_per_sec": 0, 00:19:47.541 "rw_mbytes_per_sec": 0, 00:19:47.541 "r_mbytes_per_sec": 0, 00:19:47.541 "w_mbytes_per_sec": 0 00:19:47.541 }, 00:19:47.541 "claimed": true, 00:19:47.541 "claim_type": "exclusive_write", 00:19:47.541 "zoned": false, 00:19:47.541 "supported_io_types": { 00:19:47.541 "read": true, 00:19:47.541 "write": true, 00:19:47.541 "unmap": true, 00:19:47.541 "flush": true, 00:19:47.541 "reset": true, 00:19:47.541 "nvme_admin": false, 00:19:47.541 "nvme_io": false, 00:19:47.541 "nvme_io_md": false, 00:19:47.541 "write_zeroes": true, 00:19:47.541 "zcopy": true, 00:19:47.541 "get_zone_info": false, 00:19:47.541 "zone_management": false, 00:19:47.541 "zone_append": false, 00:19:47.541 "compare": false, 00:19:47.541 "compare_and_write": false, 00:19:47.541 "abort": true, 00:19:47.541 "seek_hole": false, 00:19:47.541 "seek_data": false, 00:19:47.541 "copy": true, 00:19:47.541 "nvme_iov_md": false 00:19:47.541 }, 00:19:47.541 "memory_domains": [ 00:19:47.541 { 00:19:47.541 "dma_device_id": "system", 00:19:47.541 "dma_device_type": 1 00:19:47.541 }, 00:19:47.541 { 00:19:47.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.541 "dma_device_type": 2 00:19:47.541 } 00:19:47.541 ], 00:19:47.541 "driver_specific": {} 00:19:47.541 }' 00:19:47.541 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.800 07:25:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.800 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.057 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.057 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:48.057 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:48.057 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:48.057 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:48.625 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:48.625 "name": "BaseBdev2", 00:19:48.625 "aliases": [ 00:19:48.625 "d82763ac-c687-4a38-8b6b-16e2fed94ac0" 00:19:48.625 ], 00:19:48.625 "product_name": "Malloc disk", 00:19:48.625 "block_size": 512, 00:19:48.625 "num_blocks": 65536, 00:19:48.625 "uuid": "d82763ac-c687-4a38-8b6b-16e2fed94ac0", 00:19:48.625 "assigned_rate_limits": { 00:19:48.625 "rw_ios_per_sec": 0, 00:19:48.625 "rw_mbytes_per_sec": 0, 00:19:48.625 "r_mbytes_per_sec": 0, 00:19:48.625 "w_mbytes_per_sec": 0 00:19:48.625 }, 00:19:48.625 "claimed": true, 00:19:48.625 "claim_type": "exclusive_write", 00:19:48.625 "zoned": false, 00:19:48.625 "supported_io_types": { 00:19:48.625 "read": true, 00:19:48.625 "write": true, 00:19:48.625 "unmap": true, 00:19:48.625 "flush": true, 00:19:48.625 "reset": true, 00:19:48.625 "nvme_admin": false, 00:19:48.625 "nvme_io": false, 00:19:48.625 "nvme_io_md": false, 00:19:48.625 "write_zeroes": true, 00:19:48.625 "zcopy": true, 00:19:48.625 "get_zone_info": false, 00:19:48.625 "zone_management": false, 00:19:48.625 "zone_append": false, 00:19:48.625 "compare": false, 00:19:48.625 "compare_and_write": false, 00:19:48.625 "abort": true, 00:19:48.625 "seek_hole": false, 00:19:48.625 "seek_data": false, 00:19:48.625 "copy": true, 00:19:48.625 "nvme_iov_md": false 00:19:48.625 }, 00:19:48.625 "memory_domains": [ 00:19:48.625 { 00:19:48.625 "dma_device_id": "system", 00:19:48.625 "dma_device_type": 1 00:19:48.625 }, 00:19:48.625 { 00:19:48.625 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.625 "dma_device_type": 2 00:19:48.625 } 00:19:48.625 ], 00:19:48.625 "driver_specific": {} 00:19:48.625 }' 00:19:48.625 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.625 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.625 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:48.625 07:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.625 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.625 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:48.625 07:25:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.625 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.625 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:48.625 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.884 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.884 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:48.884 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:48.884 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:48.884 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:49.143 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:49.143 "name": "BaseBdev3", 00:19:49.143 "aliases": [ 00:19:49.143 "ff1c324d-a853-44f4-9636-247b9651e9eb" 00:19:49.143 ], 00:19:49.143 "product_name": "Malloc disk", 00:19:49.143 "block_size": 512, 00:19:49.143 "num_blocks": 65536, 00:19:49.143 "uuid": "ff1c324d-a853-44f4-9636-247b9651e9eb", 00:19:49.143 "assigned_rate_limits": { 00:19:49.143 "rw_ios_per_sec": 0, 00:19:49.143 "rw_mbytes_per_sec": 0, 00:19:49.143 "r_mbytes_per_sec": 0, 00:19:49.143 "w_mbytes_per_sec": 0 00:19:49.143 }, 00:19:49.143 "claimed": true, 00:19:49.143 "claim_type": "exclusive_write", 00:19:49.143 "zoned": false, 00:19:49.143 "supported_io_types": { 00:19:49.143 "read": true, 00:19:49.143 "write": true, 00:19:49.143 "unmap": true, 00:19:49.143 "flush": true, 00:19:49.143 "reset": true, 00:19:49.143 "nvme_admin": false, 00:19:49.143 "nvme_io": false, 00:19:49.143 "nvme_io_md": false, 00:19:49.143 "write_zeroes": true, 00:19:49.143 "zcopy": true, 00:19:49.143 "get_zone_info": false, 00:19:49.143 "zone_management": false, 00:19:49.143 "zone_append": false, 00:19:49.143 "compare": false, 00:19:49.143 "compare_and_write": false, 00:19:49.143 "abort": true, 00:19:49.143 "seek_hole": false, 00:19:49.143 "seek_data": false, 00:19:49.143 "copy": true, 00:19:49.143 "nvme_iov_md": false 00:19:49.143 }, 00:19:49.143 "memory_domains": [ 00:19:49.143 { 00:19:49.143 "dma_device_id": "system", 00:19:49.143 "dma_device_type": 1 00:19:49.143 }, 00:19:49.143 { 00:19:49.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.143 "dma_device_type": 2 00:19:49.143 } 00:19:49.143 ], 00:19:49.143 "driver_specific": {} 00:19:49.143 }' 00:19:49.143 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.143 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.143 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:49.143 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.143 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.144 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:49.144 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.144 07:25:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.402 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:49.402 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.402 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.402 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:49.402 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:49.402 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:49.402 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:49.661 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:49.661 "name": "BaseBdev4", 00:19:49.661 "aliases": [ 00:19:49.661 "679442e0-2724-4c86-94e8-df0cd74e1558" 00:19:49.661 ], 00:19:49.661 "product_name": "Malloc disk", 00:19:49.661 "block_size": 512, 00:19:49.661 "num_blocks": 65536, 00:19:49.661 "uuid": "679442e0-2724-4c86-94e8-df0cd74e1558", 00:19:49.661 "assigned_rate_limits": { 00:19:49.661 "rw_ios_per_sec": 0, 00:19:49.661 "rw_mbytes_per_sec": 0, 00:19:49.661 "r_mbytes_per_sec": 0, 00:19:49.661 "w_mbytes_per_sec": 0 00:19:49.661 }, 00:19:49.661 "claimed": true, 00:19:49.661 "claim_type": "exclusive_write", 00:19:49.661 "zoned": false, 00:19:49.661 "supported_io_types": { 00:19:49.661 "read": true, 00:19:49.661 "write": true, 00:19:49.661 "unmap": true, 00:19:49.661 "flush": true, 00:19:49.661 "reset": true, 00:19:49.661 "nvme_admin": false, 00:19:49.661 "nvme_io": false, 00:19:49.661 "nvme_io_md": false, 00:19:49.661 "write_zeroes": true, 00:19:49.661 "zcopy": true, 00:19:49.661 "get_zone_info": false, 00:19:49.661 "zone_management": false, 00:19:49.661 "zone_append": false, 00:19:49.661 "compare": false, 00:19:49.661 "compare_and_write": false, 00:19:49.661 "abort": true, 00:19:49.661 "seek_hole": false, 00:19:49.661 "seek_data": false, 00:19:49.661 "copy": true, 00:19:49.661 "nvme_iov_md": false 00:19:49.661 }, 00:19:49.661 "memory_domains": [ 00:19:49.661 { 00:19:49.661 "dma_device_id": "system", 00:19:49.661 "dma_device_type": 1 00:19:49.661 }, 00:19:49.661 { 00:19:49.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.661 "dma_device_type": 2 00:19:49.661 } 00:19:49.661 ], 00:19:49.661 "driver_specific": {} 00:19:49.662 }' 00:19:49.662 07:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.662 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.662 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:49.662 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.662 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.662 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:49.662 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.921 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.921 07:25:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:49.921 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.921 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.921 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:49.921 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:50.180 [2024-07-25 07:25:22.455782] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:50.180 [2024-07-25 07:25:22.455806] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:50.180 [2024-07-25 07:25:22.455851] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:50.180 [2024-07-25 07:25:22.455908] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:50.180 [2024-07-25 07:25:22.455920] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250a270 name Existed_Raid, state offline 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1666495 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1666495 ']' 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1666495 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1666495 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1666495' 00:19:50.180 killing process with pid 1666495 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1666495 00:19:50.180 [2024-07-25 07:25:22.554534] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:50.180 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1666495 00:19:50.180 [2024-07-25 07:25:22.585819] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:50.439 07:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:50.439 00:19:50.439 real 0m30.754s 00:19:50.439 user 0m56.537s 00:19:50.439 sys 0m5.481s 00:19:50.439 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:50.439 07:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:50.439 ************************************ 00:19:50.439 END TEST raid_state_function_test_sb 00:19:50.439 ************************************ 00:19:50.439 07:25:22 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:19:50.439 07:25:22 
bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:19:50.439 07:25:22 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:50.439 07:25:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:50.439 ************************************ 00:19:50.439 START TEST raid_superblock_test 00:19:50.439 ************************************ 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 4 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1672427 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1672427 /var/tmp/spdk-raid.sock 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1672427 ']' 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:50.440 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
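The raid_superblock_test setup that follows in the trace builds its fixture entirely through rpc.py against /var/tmp/spdk-raid.sock: four 32 MiB malloc bdevs (512-byte blocks) are each wrapped in a passthru bdev, and the four passthru bdevs are assembled into a RAID0 volume with a 64 KiB strip and an on-disk superblock (-s). A hedged shell sketch of that sequence, reconstructed from the commands in the trace rather than copied from the test script:

#!/usr/bin/env bash
# Illustrative reconstruction of the fixture setup traced below; assumes the
# bdev_svc app is already listening on /var/tmp/spdk-raid.sock as in the log.
set -e
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

for i in 1 2 3 4; do
  # 32 MiB malloc bdev with 512-byte blocks (65536 blocks, as reported later)
  $rpc bdev_malloc_create 32 512 -b malloc$i
  # wrap it in a passthru bdev so the raid test can claim and delete it freely
  $rpc bdev_passthru_create -b malloc$i -p pt$i \
       -u 00000000-0000-0000-0000-00000000000$i
done

# RAID0 over the four passthru bdevs: 64 KiB strip (-z 64), superblock on (-s)
$rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

# dump the assembled volume's state, as the verification helpers do with jq
$rpc bdev_raid_get_bdevs all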
00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:50.440 07:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.440 [2024-07-25 07:25:22.916403] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:19:50.440 [2024-07-25 07:25:22.916450] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1672427 ] 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.0 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.1 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.2 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.3 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.4 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.5 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.6 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:01.7 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:02.0 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:02.1 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:02.2 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:02.3 cannot be used 00:19:50.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.440 EAL: Requested device 0000:3d:02.4 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3d:02.5 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3d:02.6 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3d:02.7 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:01.0 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:01.1 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:01.2 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:01.3 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 
0000:3f:01.4 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:01.5 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:01.6 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:01.7 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.0 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.1 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.2 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.3 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.4 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.5 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.6 cannot be used 00:19:50.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:50.699 EAL: Requested device 0000:3f:02.7 cannot be used 00:19:50.699 [2024-07-25 07:25:23.032079] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:50.699 [2024-07-25 07:25:23.113398] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:19:50.699 [2024-07-25 07:25:23.173611] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:50.699 [2024-07-25 07:25:23.173649] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:51.635 07:25:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:51.635 malloc1 00:19:51.635 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:51.894 [2024-07-25 07:25:24.267924] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:51.894 [2024-07-25 07:25:24.267973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.894 [2024-07-25 07:25:24.267991] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1203280 00:19:51.894 [2024-07-25 07:25:24.268003] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.894 [2024-07-25 07:25:24.269515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.894 [2024-07-25 07:25:24.269545] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:51.894 pt1 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:51.894 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:52.152 malloc2 00:19:52.153 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:52.411 [2024-07-25 07:25:24.733699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:52.411 [2024-07-25 07:25:24.733741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:52.411 [2024-07-25 07:25:24.733757] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13ae8c0 00:19:52.411 [2024-07-25 07:25:24.733768] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:52.412 [2024-07-25 07:25:24.735056] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:52.412 [2024-07-25 07:25:24.735083] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:52.412 pt2 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:52.412 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:52.670 malloc3 00:19:52.670 07:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:52.670 [2024-07-25 07:25:25.195095] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:52.670 [2024-07-25 07:25:25.195133] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:52.670 [2024-07-25 07:25:25.195157] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13aeef0 00:19:52.670 [2024-07-25 07:25:25.195169] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:52.670 [2024-07-25 07:25:25.196410] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:52.670 [2024-07-25 07:25:25.196436] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:52.670 pt3 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:52.929 malloc4 00:19:52.929 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:53.188 [2024-07-25 07:25:25.648424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:53.188 [2024-07-25 07:25:25.648462] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:53.188 [2024-07-25 07:25:25.648478] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13b2330 00:19:53.188 [2024-07-25 07:25:25.648489] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:53.188 [2024-07-25 07:25:25.649749] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:53.188 [2024-07-25 07:25:25.649776] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: pt4 00:19:53.188 pt4 00:19:53.188 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:53.188 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:53.188 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:53.446 [2024-07-25 07:25:25.873036] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:53.446 [2024-07-25 07:25:25.874158] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:53.446 [2024-07-25 07:25:25.874210] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:53.446 [2024-07-25 07:25:25.874249] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:53.446 [2024-07-25 07:25:25.874408] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13b1720 00:19:53.446 [2024-07-25 07:25:25.874418] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:53.446 [2024-07-25 07:25:25.874596] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13b4e30 00:19:53.446 [2024-07-25 07:25:25.874723] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13b1720 00:19:53.446 [2024-07-25 07:25:25.874733] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13b1720 00:19:53.446 [2024-07-25 07:25:25.874816] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.446 07:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.705 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.705 "name": "raid_bdev1", 00:19:53.705 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:19:53.705 "strip_size_kb": 64, 00:19:53.705 "state": "online", 00:19:53.705 "raid_level": "raid0", 00:19:53.705 "superblock": true, 00:19:53.705 "num_base_bdevs": 4, 00:19:53.705 
"num_base_bdevs_discovered": 4, 00:19:53.705 "num_base_bdevs_operational": 4, 00:19:53.705 "base_bdevs_list": [ 00:19:53.705 { 00:19:53.705 "name": "pt1", 00:19:53.705 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:53.705 "is_configured": true, 00:19:53.705 "data_offset": 2048, 00:19:53.705 "data_size": 63488 00:19:53.705 }, 00:19:53.705 { 00:19:53.705 "name": "pt2", 00:19:53.705 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:53.705 "is_configured": true, 00:19:53.705 "data_offset": 2048, 00:19:53.705 "data_size": 63488 00:19:53.705 }, 00:19:53.705 { 00:19:53.705 "name": "pt3", 00:19:53.705 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:53.705 "is_configured": true, 00:19:53.705 "data_offset": 2048, 00:19:53.705 "data_size": 63488 00:19:53.705 }, 00:19:53.705 { 00:19:53.705 "name": "pt4", 00:19:53.705 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:53.705 "is_configured": true, 00:19:53.705 "data_offset": 2048, 00:19:53.705 "data_size": 63488 00:19:53.705 } 00:19:53.705 ] 00:19:53.705 }' 00:19:53.705 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.705 07:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:54.271 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:54.602 [2024-07-25 07:25:26.948130] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:54.602 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:54.602 "name": "raid_bdev1", 00:19:54.602 "aliases": [ 00:19:54.602 "19b304d9-66d0-42bd-a924-f37509d92046" 00:19:54.602 ], 00:19:54.602 "product_name": "Raid Volume", 00:19:54.602 "block_size": 512, 00:19:54.602 "num_blocks": 253952, 00:19:54.602 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:19:54.602 "assigned_rate_limits": { 00:19:54.602 "rw_ios_per_sec": 0, 00:19:54.602 "rw_mbytes_per_sec": 0, 00:19:54.602 "r_mbytes_per_sec": 0, 00:19:54.602 "w_mbytes_per_sec": 0 00:19:54.602 }, 00:19:54.602 "claimed": false, 00:19:54.602 "zoned": false, 00:19:54.602 "supported_io_types": { 00:19:54.602 "read": true, 00:19:54.602 "write": true, 00:19:54.602 "unmap": true, 00:19:54.602 "flush": true, 00:19:54.602 "reset": true, 00:19:54.602 "nvme_admin": false, 00:19:54.602 "nvme_io": false, 00:19:54.602 "nvme_io_md": false, 00:19:54.602 "write_zeroes": true, 00:19:54.602 "zcopy": false, 00:19:54.602 "get_zone_info": false, 00:19:54.602 "zone_management": false, 00:19:54.602 "zone_append": false, 00:19:54.602 "compare": false, 00:19:54.602 "compare_and_write": false, 00:19:54.602 "abort": false, 00:19:54.602 "seek_hole": false, 00:19:54.602 "seek_data": 
false, 00:19:54.602 "copy": false, 00:19:54.602 "nvme_iov_md": false 00:19:54.602 }, 00:19:54.602 "memory_domains": [ 00:19:54.602 { 00:19:54.602 "dma_device_id": "system", 00:19:54.602 "dma_device_type": 1 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.602 "dma_device_type": 2 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "dma_device_id": "system", 00:19:54.602 "dma_device_type": 1 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.602 "dma_device_type": 2 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "dma_device_id": "system", 00:19:54.602 "dma_device_type": 1 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.602 "dma_device_type": 2 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "dma_device_id": "system", 00:19:54.602 "dma_device_type": 1 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.602 "dma_device_type": 2 00:19:54.602 } 00:19:54.602 ], 00:19:54.602 "driver_specific": { 00:19:54.602 "raid": { 00:19:54.602 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:19:54.602 "strip_size_kb": 64, 00:19:54.602 "state": "online", 00:19:54.602 "raid_level": "raid0", 00:19:54.602 "superblock": true, 00:19:54.602 "num_base_bdevs": 4, 00:19:54.602 "num_base_bdevs_discovered": 4, 00:19:54.602 "num_base_bdevs_operational": 4, 00:19:54.602 "base_bdevs_list": [ 00:19:54.602 { 00:19:54.602 "name": "pt1", 00:19:54.602 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:54.602 "is_configured": true, 00:19:54.602 "data_offset": 2048, 00:19:54.602 "data_size": 63488 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "name": "pt2", 00:19:54.602 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:54.602 "is_configured": true, 00:19:54.602 "data_offset": 2048, 00:19:54.602 "data_size": 63488 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "name": "pt3", 00:19:54.602 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:54.602 "is_configured": true, 00:19:54.602 "data_offset": 2048, 00:19:54.602 "data_size": 63488 00:19:54.602 }, 00:19:54.602 { 00:19:54.602 "name": "pt4", 00:19:54.602 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:54.602 "is_configured": true, 00:19:54.602 "data_offset": 2048, 00:19:54.602 "data_size": 63488 00:19:54.602 } 00:19:54.602 ] 00:19:54.602 } 00:19:54.602 } 00:19:54.602 }' 00:19:54.602 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:54.602 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:54.602 pt2 00:19:54.602 pt3 00:19:54.602 pt4' 00:19:54.602 07:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.602 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:54.602 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.170 "name": "pt1", 00:19:55.170 "aliases": [ 00:19:55.170 "00000000-0000-0000-0000-000000000001" 00:19:55.170 ], 00:19:55.170 "product_name": "passthru", 00:19:55.170 "block_size": 512, 00:19:55.170 "num_blocks": 65536, 00:19:55.170 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:55.170 "assigned_rate_limits": { 
00:19:55.170 "rw_ios_per_sec": 0, 00:19:55.170 "rw_mbytes_per_sec": 0, 00:19:55.170 "r_mbytes_per_sec": 0, 00:19:55.170 "w_mbytes_per_sec": 0 00:19:55.170 }, 00:19:55.170 "claimed": true, 00:19:55.170 "claim_type": "exclusive_write", 00:19:55.170 "zoned": false, 00:19:55.170 "supported_io_types": { 00:19:55.170 "read": true, 00:19:55.170 "write": true, 00:19:55.170 "unmap": true, 00:19:55.170 "flush": true, 00:19:55.170 "reset": true, 00:19:55.170 "nvme_admin": false, 00:19:55.170 "nvme_io": false, 00:19:55.170 "nvme_io_md": false, 00:19:55.170 "write_zeroes": true, 00:19:55.170 "zcopy": true, 00:19:55.170 "get_zone_info": false, 00:19:55.170 "zone_management": false, 00:19:55.170 "zone_append": false, 00:19:55.170 "compare": false, 00:19:55.170 "compare_and_write": false, 00:19:55.170 "abort": true, 00:19:55.170 "seek_hole": false, 00:19:55.170 "seek_data": false, 00:19:55.170 "copy": true, 00:19:55.170 "nvme_iov_md": false 00:19:55.170 }, 00:19:55.170 "memory_domains": [ 00:19:55.170 { 00:19:55.170 "dma_device_id": "system", 00:19:55.170 "dma_device_type": 1 00:19:55.170 }, 00:19:55.170 { 00:19:55.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.170 "dma_device_type": 2 00:19:55.170 } 00:19:55.170 ], 00:19:55.170 "driver_specific": { 00:19:55.170 "passthru": { 00:19:55.170 "name": "pt1", 00:19:55.170 "base_bdev_name": "malloc1" 00:19:55.170 } 00:19:55.170 } 00:19:55.170 }' 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.170 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:55.428 07:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.687 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.687 "name": "pt2", 00:19:55.687 "aliases": [ 00:19:55.687 "00000000-0000-0000-0000-000000000002" 00:19:55.687 ], 00:19:55.687 "product_name": "passthru", 00:19:55.687 "block_size": 512, 00:19:55.687 "num_blocks": 65536, 00:19:55.687 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:55.687 "assigned_rate_limits": { 00:19:55.687 "rw_ios_per_sec": 0, 00:19:55.687 "rw_mbytes_per_sec": 0, 00:19:55.687 "r_mbytes_per_sec": 0, 00:19:55.687 
"w_mbytes_per_sec": 0 00:19:55.687 }, 00:19:55.687 "claimed": true, 00:19:55.687 "claim_type": "exclusive_write", 00:19:55.687 "zoned": false, 00:19:55.687 "supported_io_types": { 00:19:55.687 "read": true, 00:19:55.687 "write": true, 00:19:55.687 "unmap": true, 00:19:55.687 "flush": true, 00:19:55.687 "reset": true, 00:19:55.687 "nvme_admin": false, 00:19:55.687 "nvme_io": false, 00:19:55.687 "nvme_io_md": false, 00:19:55.687 "write_zeroes": true, 00:19:55.687 "zcopy": true, 00:19:55.687 "get_zone_info": false, 00:19:55.687 "zone_management": false, 00:19:55.687 "zone_append": false, 00:19:55.687 "compare": false, 00:19:55.687 "compare_and_write": false, 00:19:55.687 "abort": true, 00:19:55.687 "seek_hole": false, 00:19:55.687 "seek_data": false, 00:19:55.687 "copy": true, 00:19:55.687 "nvme_iov_md": false 00:19:55.687 }, 00:19:55.687 "memory_domains": [ 00:19:55.687 { 00:19:55.687 "dma_device_id": "system", 00:19:55.687 "dma_device_type": 1 00:19:55.687 }, 00:19:55.687 { 00:19:55.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.687 "dma_device_type": 2 00:19:55.687 } 00:19:55.687 ], 00:19:55.687 "driver_specific": { 00:19:55.687 "passthru": { 00:19:55.687 "name": "pt2", 00:19:55.687 "base_bdev_name": "malloc2" 00:19:55.687 } 00:19:55.687 } 00:19:55.687 }' 00:19:55.687 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.687 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.687 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.687 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.687 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:55.945 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:56.204 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:56.204 "name": "pt3", 00:19:56.204 "aliases": [ 00:19:56.204 "00000000-0000-0000-0000-000000000003" 00:19:56.204 ], 00:19:56.204 "product_name": "passthru", 00:19:56.204 "block_size": 512, 00:19:56.204 "num_blocks": 65536, 00:19:56.204 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:56.204 "assigned_rate_limits": { 00:19:56.204 "rw_ios_per_sec": 0, 00:19:56.204 "rw_mbytes_per_sec": 0, 00:19:56.204 "r_mbytes_per_sec": 0, 00:19:56.204 "w_mbytes_per_sec": 0 00:19:56.204 }, 00:19:56.204 "claimed": true, 00:19:56.204 "claim_type": "exclusive_write", 00:19:56.204 "zoned": 
false, 00:19:56.204 "supported_io_types": { 00:19:56.204 "read": true, 00:19:56.204 "write": true, 00:19:56.204 "unmap": true, 00:19:56.204 "flush": true, 00:19:56.204 "reset": true, 00:19:56.204 "nvme_admin": false, 00:19:56.204 "nvme_io": false, 00:19:56.204 "nvme_io_md": false, 00:19:56.204 "write_zeroes": true, 00:19:56.204 "zcopy": true, 00:19:56.204 "get_zone_info": false, 00:19:56.204 "zone_management": false, 00:19:56.204 "zone_append": false, 00:19:56.204 "compare": false, 00:19:56.204 "compare_and_write": false, 00:19:56.204 "abort": true, 00:19:56.204 "seek_hole": false, 00:19:56.204 "seek_data": false, 00:19:56.204 "copy": true, 00:19:56.204 "nvme_iov_md": false 00:19:56.204 }, 00:19:56.204 "memory_domains": [ 00:19:56.204 { 00:19:56.204 "dma_device_id": "system", 00:19:56.204 "dma_device_type": 1 00:19:56.204 }, 00:19:56.204 { 00:19:56.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.204 "dma_device_type": 2 00:19:56.204 } 00:19:56.204 ], 00:19:56.204 "driver_specific": { 00:19:56.204 "passthru": { 00:19:56.204 "name": "pt3", 00:19:56.204 "base_bdev_name": "malloc3" 00:19:56.204 } 00:19:56.204 } 00:19:56.204 }' 00:19:56.204 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.204 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.204 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.204 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:56.463 07:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:56.721 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:56.721 "name": "pt4", 00:19:56.721 "aliases": [ 00:19:56.721 "00000000-0000-0000-0000-000000000004" 00:19:56.721 ], 00:19:56.721 "product_name": "passthru", 00:19:56.721 "block_size": 512, 00:19:56.721 "num_blocks": 65536, 00:19:56.721 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:56.721 "assigned_rate_limits": { 00:19:56.721 "rw_ios_per_sec": 0, 00:19:56.721 "rw_mbytes_per_sec": 0, 00:19:56.721 "r_mbytes_per_sec": 0, 00:19:56.721 "w_mbytes_per_sec": 0 00:19:56.721 }, 00:19:56.721 "claimed": true, 00:19:56.721 "claim_type": "exclusive_write", 00:19:56.721 "zoned": false, 00:19:56.721 "supported_io_types": { 00:19:56.721 "read": true, 00:19:56.721 "write": true, 00:19:56.721 "unmap": true, 
00:19:56.721 "flush": true, 00:19:56.721 "reset": true, 00:19:56.721 "nvme_admin": false, 00:19:56.721 "nvme_io": false, 00:19:56.721 "nvme_io_md": false, 00:19:56.721 "write_zeroes": true, 00:19:56.721 "zcopy": true, 00:19:56.722 "get_zone_info": false, 00:19:56.722 "zone_management": false, 00:19:56.722 "zone_append": false, 00:19:56.722 "compare": false, 00:19:56.722 "compare_and_write": false, 00:19:56.722 "abort": true, 00:19:56.722 "seek_hole": false, 00:19:56.722 "seek_data": false, 00:19:56.722 "copy": true, 00:19:56.722 "nvme_iov_md": false 00:19:56.722 }, 00:19:56.722 "memory_domains": [ 00:19:56.722 { 00:19:56.722 "dma_device_id": "system", 00:19:56.722 "dma_device_type": 1 00:19:56.722 }, 00:19:56.722 { 00:19:56.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.722 "dma_device_type": 2 00:19:56.722 } 00:19:56.722 ], 00:19:56.722 "driver_specific": { 00:19:56.722 "passthru": { 00:19:56.722 "name": "pt4", 00:19:56.722 "base_bdev_name": "malloc4" 00:19:56.722 } 00:19:56.722 } 00:19:56.722 }' 00:19:56.722 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.980 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.238 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:57.238 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:57.238 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:19:57.238 [2024-07-25 07:25:29.671283] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:57.238 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=19b304d9-66d0-42bd-a924-f37509d92046 00:19:57.238 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 19b304d9-66d0-42bd-a924-f37509d92046 ']' 00:19:57.238 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:57.497 [2024-07-25 07:25:29.903617] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:57.497 [2024-07-25 07:25:29.903635] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:57.497 [2024-07-25 07:25:29.903679] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:57.497 [2024-07-25 07:25:29.903735] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:57.497 [2024-07-25 07:25:29.903746] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b1720 name raid_bdev1, state offline 00:19:57.497 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.497 07:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:19:57.756 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:19:57.756 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:19:57.756 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:57.756 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:58.015 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:58.015 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:58.273 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:58.273 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:58.532 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:58.532 07:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:58.790 07:25:31 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:58.790 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:59.049 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:59.049 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:59.049 [2024-07-25 07:25:31.535852] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:59.049 [2024-07-25 07:25:31.537108] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:59.049 [2024-07-25 07:25:31.537158] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:59.049 [2024-07-25 07:25:31.537190] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:59.049 [2024-07-25 07:25:31.537231] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:59.049 [2024-07-25 07:25:31.537275] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:59.049 [2024-07-25 07:25:31.537297] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:59.049 [2024-07-25 07:25:31.537318] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:59.049 [2024-07-25 07:25:31.537335] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:59.049 [2024-07-25 07:25:31.537345] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b4b00 name raid_bdev1, state configuring 00:19:59.049 request: 00:19:59.049 { 00:19:59.049 "name": "raid_bdev1", 00:19:59.050 "raid_level": "raid0", 00:19:59.050 "base_bdevs": [ 00:19:59.050 "malloc1", 00:19:59.050 "malloc2", 00:19:59.050 "malloc3", 00:19:59.050 "malloc4" 00:19:59.050 ], 00:19:59.050 "strip_size_kb": 64, 00:19:59.050 "superblock": false, 00:19:59.050 "method": "bdev_raid_create", 00:19:59.050 "req_id": 1 00:19:59.050 } 00:19:59.050 Got JSON-RPC error response 00:19:59.050 response: 00:19:59.050 { 00:19:59.050 "code": -17, 00:19:59.050 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:59.050 } 00:19:59.050 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:19:59.050 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:19:59.050 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:19:59.050 07:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:19:59.050 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.050 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:19:59.309 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:19:59.309 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:19:59.309 07:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:59.568 [2024-07-25 07:25:32.001015] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:59.568 [2024-07-25 07:25:32.001067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:59.568 [2024-07-25 07:25:32.001085] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13ac490 00:19:59.568 [2024-07-25 07:25:32.001097] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:59.568 [2024-07-25 07:25:32.002556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:59.568 [2024-07-25 07:25:32.002584] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:59.568 [2024-07-25 07:25:32.002644] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:59.568 [2024-07-25 07:25:32.002669] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:59.568 pt1 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.568 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.827 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.827 "name": "raid_bdev1", 00:19:59.827 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:19:59.827 "strip_size_kb": 64, 00:19:59.827 "state": "configuring", 00:19:59.827 "raid_level": "raid0", 00:19:59.827 "superblock": true, 00:19:59.827 "num_base_bdevs": 4, 00:19:59.827 "num_base_bdevs_discovered": 1, 00:19:59.827 "num_base_bdevs_operational": 4, 00:19:59.827 "base_bdevs_list": [ 00:19:59.827 { 00:19:59.827 "name": "pt1", 
00:19:59.827 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:59.827 "is_configured": true, 00:19:59.827 "data_offset": 2048, 00:19:59.827 "data_size": 63488 00:19:59.827 }, 00:19:59.827 { 00:19:59.827 "name": null, 00:19:59.827 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:59.827 "is_configured": false, 00:19:59.827 "data_offset": 2048, 00:19:59.827 "data_size": 63488 00:19:59.827 }, 00:19:59.827 { 00:19:59.827 "name": null, 00:19:59.827 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:59.827 "is_configured": false, 00:19:59.827 "data_offset": 2048, 00:19:59.827 "data_size": 63488 00:19:59.827 }, 00:19:59.827 { 00:19:59.827 "name": null, 00:19:59.827 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:59.827 "is_configured": false, 00:19:59.827 "data_offset": 2048, 00:19:59.827 "data_size": 63488 00:19:59.827 } 00:19:59.827 ] 00:19:59.827 }' 00:19:59.827 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.827 07:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.393 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:20:00.393 07:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:00.651 [2024-07-25 07:25:32.995711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:00.651 [2024-07-25 07:25:32.995760] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.651 [2024-07-25 07:25:32.995778] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13b2f00 00:20:00.651 [2024-07-25 07:25:32.995790] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.651 [2024-07-25 07:25:32.996118] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.651 [2024-07-25 07:25:32.996134] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:00.651 [2024-07-25 07:25:32.996198] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:00.651 [2024-07-25 07:25:32.996217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:00.651 pt2 00:20:00.651 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:00.909 [2024-07-25 07:25:33.224321] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:00.909 07:25:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:00.909 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.167 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.167 "name": "raid_bdev1", 00:20:01.167 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:20:01.167 "strip_size_kb": 64, 00:20:01.167 "state": "configuring", 00:20:01.167 "raid_level": "raid0", 00:20:01.167 "superblock": true, 00:20:01.167 "num_base_bdevs": 4, 00:20:01.167 "num_base_bdevs_discovered": 1, 00:20:01.167 "num_base_bdevs_operational": 4, 00:20:01.167 "base_bdevs_list": [ 00:20:01.167 { 00:20:01.167 "name": "pt1", 00:20:01.167 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:01.167 "is_configured": true, 00:20:01.167 "data_offset": 2048, 00:20:01.167 "data_size": 63488 00:20:01.167 }, 00:20:01.167 { 00:20:01.167 "name": null, 00:20:01.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:01.167 "is_configured": false, 00:20:01.167 "data_offset": 2048, 00:20:01.167 "data_size": 63488 00:20:01.167 }, 00:20:01.167 { 00:20:01.167 "name": null, 00:20:01.167 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:01.167 "is_configured": false, 00:20:01.167 "data_offset": 2048, 00:20:01.167 "data_size": 63488 00:20:01.167 }, 00:20:01.167 { 00:20:01.167 "name": null, 00:20:01.167 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:01.167 "is_configured": false, 00:20:01.167 "data_offset": 2048, 00:20:01.167 "data_size": 63488 00:20:01.167 } 00:20:01.167 ] 00:20:01.167 }' 00:20:01.167 07:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.167 07:25:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:01.733 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:20:01.733 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:01.733 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:01.733 [2024-07-25 07:25:34.263048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:01.733 [2024-07-25 07:25:34.263100] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.733 [2024-07-25 07:25:34.263120] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12036d0 00:20:01.733 [2024-07-25 07:25:34.263131] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.733 [2024-07-25 07:25:34.263457] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.733 [2024-07-25 07:25:34.263474] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:01.733 [2024-07-25 07:25:34.263531] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:01.733 [2024-07-25 
07:25:34.263548] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:01.991 pt2 00:20:01.991 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:20:01.991 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:01.991 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:01.991 [2024-07-25 07:25:34.487639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:01.991 [2024-07-25 07:25:34.487667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.991 [2024-07-25 07:25:34.487681] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13b1280 00:20:01.991 [2024-07-25 07:25:34.487692] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.991 [2024-07-25 07:25:34.487947] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.991 [2024-07-25 07:25:34.487962] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:01.991 [2024-07-25 07:25:34.488006] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:01.991 [2024-07-25 07:25:34.488022] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:01.991 pt3 00:20:01.991 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:20:01.991 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:01.991 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:02.249 [2024-07-25 07:25:34.716243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:02.249 [2024-07-25 07:25:34.716271] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.249 [2024-07-25 07:25:34.716284] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13b47f0 00:20:02.249 [2024-07-25 07:25:34.716295] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.249 [2024-07-25 07:25:34.716539] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.249 [2024-07-25 07:25:34.716553] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:02.249 [2024-07-25 07:25:34.716596] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:02.249 [2024-07-25 07:25:34.716612] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:02.249 [2024-07-25 07:25:34.716717] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13b34f0 00:20:02.249 [2024-07-25 07:25:34.716726] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:02.249 [2024-07-25 07:25:34.716875] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13ae060 00:20:02.249 [2024-07-25 07:25:34.716992] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13b34f0 00:20:02.249 [2024-07-25 07:25:34.717001] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev 
is created with name raid_bdev1, raid_bdev 0x13b34f0 00:20:02.249 [2024-07-25 07:25:34.717088] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:02.249 pt4 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.249 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.507 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.507 "name": "raid_bdev1", 00:20:02.507 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:20:02.507 "strip_size_kb": 64, 00:20:02.507 "state": "online", 00:20:02.507 "raid_level": "raid0", 00:20:02.507 "superblock": true, 00:20:02.507 "num_base_bdevs": 4, 00:20:02.507 "num_base_bdevs_discovered": 4, 00:20:02.507 "num_base_bdevs_operational": 4, 00:20:02.507 "base_bdevs_list": [ 00:20:02.507 { 00:20:02.507 "name": "pt1", 00:20:02.507 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:02.507 "is_configured": true, 00:20:02.507 "data_offset": 2048, 00:20:02.507 "data_size": 63488 00:20:02.507 }, 00:20:02.507 { 00:20:02.507 "name": "pt2", 00:20:02.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:02.507 "is_configured": true, 00:20:02.507 "data_offset": 2048, 00:20:02.507 "data_size": 63488 00:20:02.507 }, 00:20:02.507 { 00:20:02.507 "name": "pt3", 00:20:02.507 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:02.507 "is_configured": true, 00:20:02.507 "data_offset": 2048, 00:20:02.507 "data_size": 63488 00:20:02.507 }, 00:20:02.507 { 00:20:02.507 "name": "pt4", 00:20:02.507 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:02.507 "is_configured": true, 00:20:02.507 "data_offset": 2048, 00:20:02.507 "data_size": 63488 00:20:02.507 } 00:20:02.507 ] 00:20:02.507 }' 00:20:02.507 07:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.507 07:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.073 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:20:03.073 
07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:03.073 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:03.073 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:03.073 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:03.073 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:03.073 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:03.073 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:03.330 [2024-07-25 07:25:35.739235] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:03.330 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:03.330 "name": "raid_bdev1", 00:20:03.330 "aliases": [ 00:20:03.330 "19b304d9-66d0-42bd-a924-f37509d92046" 00:20:03.330 ], 00:20:03.330 "product_name": "Raid Volume", 00:20:03.330 "block_size": 512, 00:20:03.330 "num_blocks": 253952, 00:20:03.330 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:20:03.330 "assigned_rate_limits": { 00:20:03.330 "rw_ios_per_sec": 0, 00:20:03.330 "rw_mbytes_per_sec": 0, 00:20:03.330 "r_mbytes_per_sec": 0, 00:20:03.330 "w_mbytes_per_sec": 0 00:20:03.330 }, 00:20:03.330 "claimed": false, 00:20:03.330 "zoned": false, 00:20:03.330 "supported_io_types": { 00:20:03.330 "read": true, 00:20:03.330 "write": true, 00:20:03.330 "unmap": true, 00:20:03.330 "flush": true, 00:20:03.330 "reset": true, 00:20:03.330 "nvme_admin": false, 00:20:03.330 "nvme_io": false, 00:20:03.330 "nvme_io_md": false, 00:20:03.330 "write_zeroes": true, 00:20:03.330 "zcopy": false, 00:20:03.330 "get_zone_info": false, 00:20:03.330 "zone_management": false, 00:20:03.330 "zone_append": false, 00:20:03.330 "compare": false, 00:20:03.330 "compare_and_write": false, 00:20:03.330 "abort": false, 00:20:03.330 "seek_hole": false, 00:20:03.330 "seek_data": false, 00:20:03.330 "copy": false, 00:20:03.330 "nvme_iov_md": false 00:20:03.330 }, 00:20:03.330 "memory_domains": [ 00:20:03.330 { 00:20:03.330 "dma_device_id": "system", 00:20:03.330 "dma_device_type": 1 00:20:03.330 }, 00:20:03.330 { 00:20:03.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.330 "dma_device_type": 2 00:20:03.330 }, 00:20:03.330 { 00:20:03.330 "dma_device_id": "system", 00:20:03.330 "dma_device_type": 1 00:20:03.330 }, 00:20:03.330 { 00:20:03.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.330 "dma_device_type": 2 00:20:03.330 }, 00:20:03.330 { 00:20:03.330 "dma_device_id": "system", 00:20:03.330 "dma_device_type": 1 00:20:03.330 }, 00:20:03.330 { 00:20:03.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.330 "dma_device_type": 2 00:20:03.330 }, 00:20:03.330 { 00:20:03.330 "dma_device_id": "system", 00:20:03.330 "dma_device_type": 1 00:20:03.330 }, 00:20:03.330 { 00:20:03.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.330 "dma_device_type": 2 00:20:03.330 } 00:20:03.330 ], 00:20:03.330 "driver_specific": { 00:20:03.330 "raid": { 00:20:03.330 "uuid": "19b304d9-66d0-42bd-a924-f37509d92046", 00:20:03.330 "strip_size_kb": 64, 00:20:03.330 "state": "online", 00:20:03.330 "raid_level": "raid0", 00:20:03.330 "superblock": true, 00:20:03.330 "num_base_bdevs": 4, 00:20:03.330 "num_base_bdevs_discovered": 
4, 00:20:03.330 "num_base_bdevs_operational": 4, 00:20:03.330 "base_bdevs_list": [ 00:20:03.330 { 00:20:03.331 "name": "pt1", 00:20:03.331 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:03.331 "is_configured": true, 00:20:03.331 "data_offset": 2048, 00:20:03.331 "data_size": 63488 00:20:03.331 }, 00:20:03.331 { 00:20:03.331 "name": "pt2", 00:20:03.331 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:03.331 "is_configured": true, 00:20:03.331 "data_offset": 2048, 00:20:03.331 "data_size": 63488 00:20:03.331 }, 00:20:03.331 { 00:20:03.331 "name": "pt3", 00:20:03.331 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:03.331 "is_configured": true, 00:20:03.331 "data_offset": 2048, 00:20:03.331 "data_size": 63488 00:20:03.331 }, 00:20:03.331 { 00:20:03.331 "name": "pt4", 00:20:03.331 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:03.331 "is_configured": true, 00:20:03.331 "data_offset": 2048, 00:20:03.331 "data_size": 63488 00:20:03.331 } 00:20:03.331 ] 00:20:03.331 } 00:20:03.331 } 00:20:03.331 }' 00:20:03.331 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:03.331 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:03.331 pt2 00:20:03.331 pt3 00:20:03.331 pt4' 00:20:03.331 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:03.331 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:03.331 07:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:03.588 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:03.588 "name": "pt1", 00:20:03.588 "aliases": [ 00:20:03.588 "00000000-0000-0000-0000-000000000001" 00:20:03.588 ], 00:20:03.588 "product_name": "passthru", 00:20:03.588 "block_size": 512, 00:20:03.588 "num_blocks": 65536, 00:20:03.588 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:03.588 "assigned_rate_limits": { 00:20:03.588 "rw_ios_per_sec": 0, 00:20:03.588 "rw_mbytes_per_sec": 0, 00:20:03.588 "r_mbytes_per_sec": 0, 00:20:03.588 "w_mbytes_per_sec": 0 00:20:03.588 }, 00:20:03.588 "claimed": true, 00:20:03.588 "claim_type": "exclusive_write", 00:20:03.588 "zoned": false, 00:20:03.588 "supported_io_types": { 00:20:03.588 "read": true, 00:20:03.588 "write": true, 00:20:03.588 "unmap": true, 00:20:03.588 "flush": true, 00:20:03.588 "reset": true, 00:20:03.588 "nvme_admin": false, 00:20:03.588 "nvme_io": false, 00:20:03.588 "nvme_io_md": false, 00:20:03.588 "write_zeroes": true, 00:20:03.588 "zcopy": true, 00:20:03.588 "get_zone_info": false, 00:20:03.588 "zone_management": false, 00:20:03.588 "zone_append": false, 00:20:03.588 "compare": false, 00:20:03.588 "compare_and_write": false, 00:20:03.588 "abort": true, 00:20:03.588 "seek_hole": false, 00:20:03.588 "seek_data": false, 00:20:03.588 "copy": true, 00:20:03.588 "nvme_iov_md": false 00:20:03.588 }, 00:20:03.588 "memory_domains": [ 00:20:03.588 { 00:20:03.588 "dma_device_id": "system", 00:20:03.588 "dma_device_type": 1 00:20:03.588 }, 00:20:03.588 { 00:20:03.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.588 "dma_device_type": 2 00:20:03.588 } 00:20:03.588 ], 00:20:03.588 "driver_specific": { 00:20:03.588 "passthru": { 00:20:03.588 "name": "pt1", 00:20:03.588 "base_bdev_name": "malloc1" 
00:20:03.588 } 00:20:03.588 } 00:20:03.588 }' 00:20:03.588 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:03.588 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:03.588 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:03.846 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:04.104 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:04.104 "name": "pt2", 00:20:04.104 "aliases": [ 00:20:04.104 "00000000-0000-0000-0000-000000000002" 00:20:04.104 ], 00:20:04.104 "product_name": "passthru", 00:20:04.104 "block_size": 512, 00:20:04.104 "num_blocks": 65536, 00:20:04.104 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:04.104 "assigned_rate_limits": { 00:20:04.104 "rw_ios_per_sec": 0, 00:20:04.104 "rw_mbytes_per_sec": 0, 00:20:04.104 "r_mbytes_per_sec": 0, 00:20:04.104 "w_mbytes_per_sec": 0 00:20:04.104 }, 00:20:04.104 "claimed": true, 00:20:04.104 "claim_type": "exclusive_write", 00:20:04.104 "zoned": false, 00:20:04.104 "supported_io_types": { 00:20:04.104 "read": true, 00:20:04.104 "write": true, 00:20:04.104 "unmap": true, 00:20:04.104 "flush": true, 00:20:04.104 "reset": true, 00:20:04.104 "nvme_admin": false, 00:20:04.104 "nvme_io": false, 00:20:04.104 "nvme_io_md": false, 00:20:04.104 "write_zeroes": true, 00:20:04.104 "zcopy": true, 00:20:04.104 "get_zone_info": false, 00:20:04.104 "zone_management": false, 00:20:04.104 "zone_append": false, 00:20:04.104 "compare": false, 00:20:04.104 "compare_and_write": false, 00:20:04.104 "abort": true, 00:20:04.104 "seek_hole": false, 00:20:04.104 "seek_data": false, 00:20:04.104 "copy": true, 00:20:04.104 "nvme_iov_md": false 00:20:04.104 }, 00:20:04.104 "memory_domains": [ 00:20:04.104 { 00:20:04.104 "dma_device_id": "system", 00:20:04.104 "dma_device_type": 1 00:20:04.104 }, 00:20:04.104 { 00:20:04.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.104 "dma_device_type": 2 00:20:04.104 } 00:20:04.104 ], 00:20:04.104 "driver_specific": { 00:20:04.104 "passthru": { 00:20:04.104 "name": "pt2", 00:20:04.104 "base_bdev_name": "malloc2" 00:20:04.104 } 00:20:04.104 } 00:20:04.104 }' 00:20:04.104 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:04.361 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.618 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.619 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:04.619 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:04.619 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:04.619 07:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:04.876 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:04.876 "name": "pt3", 00:20:04.876 "aliases": [ 00:20:04.876 "00000000-0000-0000-0000-000000000003" 00:20:04.876 ], 00:20:04.876 "product_name": "passthru", 00:20:04.876 "block_size": 512, 00:20:04.876 "num_blocks": 65536, 00:20:04.876 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:04.876 "assigned_rate_limits": { 00:20:04.876 "rw_ios_per_sec": 0, 00:20:04.876 "rw_mbytes_per_sec": 0, 00:20:04.876 "r_mbytes_per_sec": 0, 00:20:04.876 "w_mbytes_per_sec": 0 00:20:04.876 }, 00:20:04.876 "claimed": true, 00:20:04.876 "claim_type": "exclusive_write", 00:20:04.876 "zoned": false, 00:20:04.876 "supported_io_types": { 00:20:04.876 "read": true, 00:20:04.876 "write": true, 00:20:04.876 "unmap": true, 00:20:04.876 "flush": true, 00:20:04.876 "reset": true, 00:20:04.876 "nvme_admin": false, 00:20:04.876 "nvme_io": false, 00:20:04.876 "nvme_io_md": false, 00:20:04.876 "write_zeroes": true, 00:20:04.876 "zcopy": true, 00:20:04.876 "get_zone_info": false, 00:20:04.876 "zone_management": false, 00:20:04.876 "zone_append": false, 00:20:04.876 "compare": false, 00:20:04.876 "compare_and_write": false, 00:20:04.877 "abort": true, 00:20:04.877 "seek_hole": false, 00:20:04.877 "seek_data": false, 00:20:04.877 "copy": true, 00:20:04.877 "nvme_iov_md": false 00:20:04.877 }, 00:20:04.877 "memory_domains": [ 00:20:04.877 { 00:20:04.877 "dma_device_id": "system", 00:20:04.877 "dma_device_type": 1 00:20:04.877 }, 00:20:04.877 { 00:20:04.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.877 "dma_device_type": 2 00:20:04.877 } 00:20:04.877 ], 00:20:04.877 "driver_specific": { 00:20:04.877 "passthru": { 00:20:04.877 "name": "pt3", 00:20:04.877 "base_bdev_name": "malloc3" 00:20:04.877 } 00:20:04.877 } 00:20:04.877 }' 00:20:04.877 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.877 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.877 07:25:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.877 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.877 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.877 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:04.877 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.877 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.134 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:05.135 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.135 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.135 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:05.135 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:05.135 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:05.135 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:05.393 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:05.393 "name": "pt4", 00:20:05.393 "aliases": [ 00:20:05.393 "00000000-0000-0000-0000-000000000004" 00:20:05.393 ], 00:20:05.393 "product_name": "passthru", 00:20:05.393 "block_size": 512, 00:20:05.393 "num_blocks": 65536, 00:20:05.393 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:05.393 "assigned_rate_limits": { 00:20:05.393 "rw_ios_per_sec": 0, 00:20:05.393 "rw_mbytes_per_sec": 0, 00:20:05.393 "r_mbytes_per_sec": 0, 00:20:05.393 "w_mbytes_per_sec": 0 00:20:05.393 }, 00:20:05.393 "claimed": true, 00:20:05.393 "claim_type": "exclusive_write", 00:20:05.393 "zoned": false, 00:20:05.393 "supported_io_types": { 00:20:05.393 "read": true, 00:20:05.393 "write": true, 00:20:05.393 "unmap": true, 00:20:05.393 "flush": true, 00:20:05.393 "reset": true, 00:20:05.393 "nvme_admin": false, 00:20:05.393 "nvme_io": false, 00:20:05.393 "nvme_io_md": false, 00:20:05.393 "write_zeroes": true, 00:20:05.393 "zcopy": true, 00:20:05.393 "get_zone_info": false, 00:20:05.393 "zone_management": false, 00:20:05.393 "zone_append": false, 00:20:05.393 "compare": false, 00:20:05.393 "compare_and_write": false, 00:20:05.393 "abort": true, 00:20:05.393 "seek_hole": false, 00:20:05.393 "seek_data": false, 00:20:05.393 "copy": true, 00:20:05.393 "nvme_iov_md": false 00:20:05.393 }, 00:20:05.393 "memory_domains": [ 00:20:05.393 { 00:20:05.393 "dma_device_id": "system", 00:20:05.393 "dma_device_type": 1 00:20:05.393 }, 00:20:05.393 { 00:20:05.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.393 "dma_device_type": 2 00:20:05.393 } 00:20:05.394 ], 00:20:05.394 "driver_specific": { 00:20:05.394 "passthru": { 00:20:05.394 "name": "pt4", 00:20:05.394 "base_bdev_name": "malloc4" 00:20:05.394 } 00:20:05.394 } 00:20:05.394 }' 00:20:05.394 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.394 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.394 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:05.394 07:25:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.394 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.394 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:05.394 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.652 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.652 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:05.652 07:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.652 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.652 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:05.652 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:05.652 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:20:05.910 [2024-07-25 07:25:38.297996] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 19b304d9-66d0-42bd-a924-f37509d92046 '!=' 19b304d9-66d0-42bd-a924-f37509d92046 ']' 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1672427 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1672427 ']' 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1672427 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1672427 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1672427' 00:20:05.910 killing process with pid 1672427 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1672427 00:20:05.910 [2024-07-25 07:25:38.368872] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:05.910 [2024-07-25 07:25:38.368930] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:05.910 [2024-07-25 07:25:38.368991] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:05.910 [2024-07-25 07:25:38.369003] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b34f0 name raid_bdev1, state offline 00:20:05.910 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1672427 00:20:05.910 [2024-07-25 
07:25:38.400958] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:06.169 07:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:20:06.169 00:20:06.169 real 0m15.730s 00:20:06.169 user 0m28.418s 00:20:06.169 sys 0m2.811s 00:20:06.169 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:06.169 07:25:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.169 ************************************ 00:20:06.169 END TEST raid_superblock_test 00:20:06.169 ************************************ 00:20:06.169 07:25:38 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:20:06.169 07:25:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:06.169 07:25:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:06.169 07:25:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:06.169 ************************************ 00:20:06.169 START TEST raid_read_error_test 00:20:06.169 ************************************ 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 read 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:20:06.169 07:25:38 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.o5v5V7xDPh 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1675403 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1675403 /var/tmp/spdk-raid.sock 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1675403 ']' 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:06.169 07:25:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:06.170 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:06.170 07:25:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:06.170 07:25:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.429 [2024-07-25 07:25:38.748048] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
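The raid_superblock_test output above boils down to a small set of JSON-RPC calls against the test socket: malloc bdevs are wrapped in passthru bdevs (pt1-pt4), a raid0 volume named raid_bdev1 is assembled from them with a 64 KiB strip size, and bdev_get_bdevs / bdev_raid_get_bdevs output is filtered with jq to check block size, metadata fields and the base-bdev list before everything is torn down again. The script below is a minimal standalone sketch of that flow, not the test script itself; it reuses only the rpc.py subcommands and flags visible in the log, the workspace path, the /var/tmp/spdk-raid.sock socket and the all-zero UUIDs are taken from this CI run, and the superblock option the test enables is omitted, so it creates a plain raid0 volume.

#!/usr/bin/env bash
# Minimal sketch (not the test script itself) of the RPC flow exercised by raid_superblock_test above.
set -e
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# 32 MiB malloc bdevs with 512-byte blocks, each wrapped in a passthru bdev with a fixed UUID.
for i in 1 2 3 4; do
    $RPC bdev_malloc_create 32 512 -b malloc$i
    $RPC bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
done

# Assemble a raid0 volume with a 64 KiB strip size from the passthru bdevs.
# (The superblock option used by the test is omitted here, so this is a plain raid0 volume.)
$RPC bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1

# Verify it the same way the test does: dump the bdev info and filter it with jq.
$RPC bdev_get_bdevs -b raid_bdev1 | jq '.[] | .uuid, .block_size, .num_blocks'
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'

# Tear down in reverse order.
$RPC bdev_raid_delete raid_bdev1
for i in 1 2 3 4; do $RPC bdev_passthru_delete pt$i; done

In the test itself the volume carries an on-disk superblock, which is why deleting and re-creating a passthru bdev triggers the "raid superblock found on bdev ptN ... bdev ptN is claimed" examine messages seen earlier in the log: the array is re-assembled automatically from the metadata written to its base bdevs.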
00:20:06.429 [2024-07-25 07:25:38.748107] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1675403 ] 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.0 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.4 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:06.429 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:06.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.429 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:06.429 [2024-07-25 07:25:38.867974] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:06.429 [2024-07-25 07:25:38.952517] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:20:06.687 [2024-07-25 07:25:39.013994] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:06.687 [2024-07-25 07:25:39.014032] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:07.286 07:25:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:07.286 07:25:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:20:07.286 07:25:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:07.286 07:25:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:07.286 BaseBdev1_malloc 00:20:07.563 07:25:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:07.563 true 00:20:07.563 07:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:07.822 [2024-07-25 07:25:40.236470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:07.822 [2024-07-25 07:25:40.236512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.822 [2024-07-25 07:25:40.236530] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b55a50 00:20:07.822 [2024-07-25 07:25:40.236542] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.822 [2024-07-25 07:25:40.238067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:07.822 [2024-07-25 07:25:40.238095] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:07.822 BaseBdev1 00:20:07.822 07:25:40 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:07.822 07:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:08.081 BaseBdev2_malloc 00:20:08.081 07:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:08.339 true 00:20:08.339 07:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:08.597 [2024-07-25 07:25:40.922700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:08.597 [2024-07-25 07:25:40.922740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:08.597 [2024-07-25 07:25:40.922758] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cfef40 00:20:08.597 [2024-07-25 07:25:40.922769] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:08.597 [2024-07-25 07:25:40.924190] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:08.597 [2024-07-25 07:25:40.924217] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:08.597 BaseBdev2 00:20:08.597 07:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:08.597 07:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:08.855 BaseBdev3_malloc 00:20:08.855 07:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:08.855 true 00:20:09.114 07:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:09.114 [2024-07-25 07:25:41.604814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:09.114 [2024-07-25 07:25:41.604855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.114 [2024-07-25 07:25:41.604872] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d02250 00:20:09.114 [2024-07-25 07:25:41.604883] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.114 [2024-07-25 07:25:41.606348] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.114 [2024-07-25 07:25:41.606374] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:09.114 BaseBdev3 00:20:09.114 07:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:09.114 07:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:09.373 BaseBdev4_malloc 00:20:09.373 07:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:09.631 true 00:20:09.631 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:09.889 [2024-07-25 07:25:42.294829] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:09.889 [2024-07-25 07:25:42.294871] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.889 [2024-07-25 07:25:42.294890] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d03b40 00:20:09.889 [2024-07-25 07:25:42.294902] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.889 [2024-07-25 07:25:42.296321] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.889 [2024-07-25 07:25:42.296348] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:09.889 BaseBdev4 00:20:09.889 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:10.148 [2024-07-25 07:25:42.519449] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:10.148 [2024-07-25 07:25:42.520624] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:10.148 [2024-07-25 07:25:42.520688] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:10.148 [2024-07-25 07:25:42.520742] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:10.148 [2024-07-25 07:25:42.520965] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d005d0 00:20:10.148 [2024-07-25 07:25:42.520976] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:10.148 [2024-07-25 07:25:42.521158] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b51c70 00:20:10.148 [2024-07-25 07:25:42.521295] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d005d0 00:20:10.148 [2024-07-25 07:25:42.521305] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d005d0 00:20:10.148 [2024-07-25 07:25:42.521397] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
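
The trace above amounts to a small RPC recipe for building the test volume: for each of the four base bdevs, a 32 MB malloc bdev with 512-byte blocks (the "32 512" arguments) is created, wrapped in an error bdev (the EE_ prefix), and exposed through a passthru bdev, and the four passthru bdevs are then assembled into a raid0 volume with a 64 KB strip size and a superblock. A condensed sketch of that sequence, with the full rpc.py path abbreviated; everything else is taken from the @828-@835 trace lines:

    # Condensed from the trace above; "rpc.py" stands for
    # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py.
    base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
    for bdev in "${base_bdevs[@]}"; do
        rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b "${bdev}_malloc"
        rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create "${bdev}_malloc"
        rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b "EE_${bdev}_malloc" -p "$bdev"
    done
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
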
00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.148 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.407 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.407 "name": "raid_bdev1", 00:20:10.407 "uuid": "f82bf9e3-0870-4d7c-8ffe-375cd8e58071", 00:20:10.407 "strip_size_kb": 64, 00:20:10.407 "state": "online", 00:20:10.407 "raid_level": "raid0", 00:20:10.407 "superblock": true, 00:20:10.407 "num_base_bdevs": 4, 00:20:10.407 "num_base_bdevs_discovered": 4, 00:20:10.407 "num_base_bdevs_operational": 4, 00:20:10.407 "base_bdevs_list": [ 00:20:10.407 { 00:20:10.407 "name": "BaseBdev1", 00:20:10.407 "uuid": "4ebe88a2-2038-5820-9a81-6e5c1c606dc2", 00:20:10.407 "is_configured": true, 00:20:10.407 "data_offset": 2048, 00:20:10.407 "data_size": 63488 00:20:10.407 }, 00:20:10.407 { 00:20:10.407 "name": "BaseBdev2", 00:20:10.407 "uuid": "a3d44001-fb29-5195-bc92-b10b0d348e6e", 00:20:10.407 "is_configured": true, 00:20:10.407 "data_offset": 2048, 00:20:10.407 "data_size": 63488 00:20:10.407 }, 00:20:10.407 { 00:20:10.407 "name": "BaseBdev3", 00:20:10.407 "uuid": "424d348b-110d-52cc-8e45-6fbdc4428804", 00:20:10.407 "is_configured": true, 00:20:10.407 "data_offset": 2048, 00:20:10.407 "data_size": 63488 00:20:10.407 }, 00:20:10.407 { 00:20:10.407 "name": "BaseBdev4", 00:20:10.407 "uuid": "36d27e8a-eb8c-5aa2-8932-6bebee762122", 00:20:10.407 "is_configured": true, 00:20:10.407 "data_offset": 2048, 00:20:10.407 "data_size": 63488 00:20:10.407 } 00:20:10.407 ] 00:20:10.407 }' 00:20:10.407 07:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.407 07:25:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.976 07:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:20:10.976 07:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:10.976 [2024-07-25 07:25:43.414036] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf4f60 00:20:11.914 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.174 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.433 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.433 "name": "raid_bdev1", 00:20:12.433 "uuid": "f82bf9e3-0870-4d7c-8ffe-375cd8e58071", 00:20:12.433 "strip_size_kb": 64, 00:20:12.433 "state": "online", 00:20:12.433 "raid_level": "raid0", 00:20:12.433 "superblock": true, 00:20:12.433 "num_base_bdevs": 4, 00:20:12.433 "num_base_bdevs_discovered": 4, 00:20:12.433 "num_base_bdevs_operational": 4, 00:20:12.433 "base_bdevs_list": [ 00:20:12.433 { 00:20:12.433 "name": "BaseBdev1", 00:20:12.433 "uuid": "4ebe88a2-2038-5820-9a81-6e5c1c606dc2", 00:20:12.433 "is_configured": true, 00:20:12.433 "data_offset": 2048, 00:20:12.433 "data_size": 63488 00:20:12.433 }, 00:20:12.433 { 00:20:12.433 "name": "BaseBdev2", 00:20:12.433 "uuid": "a3d44001-fb29-5195-bc92-b10b0d348e6e", 00:20:12.433 "is_configured": true, 00:20:12.433 "data_offset": 2048, 00:20:12.433 "data_size": 63488 00:20:12.433 }, 00:20:12.433 { 00:20:12.433 "name": "BaseBdev3", 00:20:12.433 "uuid": "424d348b-110d-52cc-8e45-6fbdc4428804", 00:20:12.433 "is_configured": true, 00:20:12.433 "data_offset": 2048, 00:20:12.433 "data_size": 63488 00:20:12.433 }, 00:20:12.433 { 00:20:12.433 "name": "BaseBdev4", 00:20:12.433 "uuid": "36d27e8a-eb8c-5aa2-8932-6bebee762122", 00:20:12.433 "is_configured": true, 00:20:12.433 "data_offset": 2048, 00:20:12.433 "data_size": 63488 00:20:12.433 } 00:20:12.433 ] 00:20:12.433 }' 00:20:12.433 07:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.433 07:25:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.002 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:13.002 [2024-07-25 07:25:45.524802] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:13.002 [2024-07-25 07:25:45.524842] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:13.002 [2024-07-25 07:25:45.527743] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:13.002 [2024-07-25 07:25:45.527779] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:13.002 [2024-07-25 07:25:45.527813] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:13.002 [2024-07-25 07:25:45.527830] bdev_raid.c: 378:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1d005d0 name raid_bdev1, state offline 00:20:13.002 0 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1675403 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1675403 ']' 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1675403 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1675403 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1675403' 00:20:13.261 killing process with pid 1675403 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1675403 00:20:13.261 [2024-07-25 07:25:45.601149] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:13.261 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1675403 00:20:13.261 [2024-07-25 07:25:45.627589] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.o5v5V7xDPh 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.48 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.48 != \0\.\0\0 ]] 00:20:13.520 00:20:13.520 real 0m7.160s 00:20:13.520 user 0m11.407s 00:20:13.520 sys 0m1.208s 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:13.520 07:25:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.520 ************************************ 00:20:13.520 END TEST raid_read_error_test 00:20:13.520 ************************************ 00:20:13.520 07:25:45 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:20:13.520 07:25:45 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:13.520 07:25:45 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:13.520 07:25:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:13.520 ************************************ 00:20:13.520 START TEST raid_write_error_test 00:20:13.520 ************************************ 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 write 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local 
raid_level=raid0 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:20:13.520 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.oakHVXRy80 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1676578 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1676578 /var/tmp/spdk-raid.sock 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1676578 ']' 
00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:13.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:13.521 07:25:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.521 [2024-07-25 07:25:45.991593] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:20:13.521 [2024-07-25 07:25:45.991654] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1676578 ] 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.0 cannot 
be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.4 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:13.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.780 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:13.780 [2024-07-25 07:25:46.123226] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:13.780 [2024-07-25 07:25:46.210548] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:20:13.780 [2024-07-25 07:25:46.274393] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:13.780 [2024-07-25 07:25:46.274430] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:14.716 07:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:14.716 07:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:20:14.716 07:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:14.716 07:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:14.716 BaseBdev1_malloc 00:20:14.716 07:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:14.974 true 00:20:14.974 07:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:15.234 [2024-07-25 07:25:47.543592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:15.234 [2024-07-25 07:25:47.543630] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.234 [2024-07-25 07:25:47.543648] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f66a50 00:20:15.234 [2024-07-25 07:25:47.543660] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.234 [2024-07-25 07:25:47.545087] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.234 [2024-07-25 07:25:47.545115] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:15.234 BaseBdev1 00:20:15.234 07:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:15.234 07:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:15.493 BaseBdev2_malloc 00:20:15.493 07:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:15.493 true 00:20:15.493 07:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:15.752 [2024-07-25 07:25:48.189636] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:15.752 [2024-07-25 07:25:48.189678] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.752 [2024-07-25 07:25:48.189696] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210ff40 00:20:15.752 [2024-07-25 07:25:48.189708] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.752 [2024-07-25 07:25:48.191121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.752 [2024-07-25 07:25:48.191155] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:15.752 BaseBdev2 00:20:15.752 07:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:15.752 07:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:16.011 BaseBdev3_malloc 00:20:16.011 07:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:16.270 true 00:20:16.270 07:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:16.530 [2024-07-25 07:25:48.879675] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:16.530 [2024-07-25 07:25:48.879714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.530 [2024-07-25 07:25:48.879732] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2113250 00:20:16.530 [2024-07-25 07:25:48.879744] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.530 [2024-07-25 07:25:48.881150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.530 [2024-07-25 07:25:48.881178] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:16.530 BaseBdev3 00:20:16.530 07:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:16.530 07:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:16.789 BaseBdev4_malloc 00:20:16.789 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:17.048 true 00:20:17.048 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:17.048 [2024-07-25 07:25:49.557825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:17.048 [2024-07-25 07:25:49.557863] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:17.048 [2024-07-25 07:25:49.557882] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2114b40 00:20:17.048 [2024-07-25 07:25:49.557893] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:17.048 [2024-07-25 07:25:49.559224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:17.048 [2024-07-25 07:25:49.559250] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:17.048 BaseBdev4 00:20:17.049 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:17.308 [2024-07-25 07:25:49.782448] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:17.308 [2024-07-25 07:25:49.783627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:17.308 [2024-07-25 07:25:49.783690] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:17.308 [2024-07-25 07:25:49.783743] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:17.308 [2024-07-25 07:25:49.783966] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21115d0 00:20:17.308 [2024-07-25 07:25:49.783977] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:17.308 [2024-07-25 07:25:49.784160] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f62c70 00:20:17.308 [2024-07-25 07:25:49.784296] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21115d0 00:20:17.308 [2024-07-25 07:25:49.784305] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21115d0 00:20:17.308 [2024-07-25 07:25:49.784398] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.308 07:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.567 07:25:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.567 "name": "raid_bdev1", 00:20:17.567 "uuid": "40d50a7c-7f3a-4126-833c-30aba39038a1", 00:20:17.567 "strip_size_kb": 64, 00:20:17.567 "state": "online", 00:20:17.567 "raid_level": "raid0", 00:20:17.567 "superblock": true, 00:20:17.567 "num_base_bdevs": 4, 00:20:17.567 "num_base_bdevs_discovered": 4, 00:20:17.567 "num_base_bdevs_operational": 4, 00:20:17.567 "base_bdevs_list": [ 00:20:17.567 { 00:20:17.567 "name": "BaseBdev1", 00:20:17.567 "uuid": "5eb2e5a6-2c5e-5e86-9c70-143e2cd38b44", 00:20:17.567 "is_configured": true, 00:20:17.567 "data_offset": 2048, 00:20:17.567 "data_size": 63488 00:20:17.567 }, 00:20:17.567 { 00:20:17.567 "name": "BaseBdev2", 00:20:17.567 "uuid": "f2bb8a5b-002d-5fa2-b868-cd35c84c02c3", 00:20:17.567 "is_configured": true, 00:20:17.567 "data_offset": 2048, 00:20:17.567 "data_size": 63488 00:20:17.567 }, 00:20:17.567 { 00:20:17.567 "name": "BaseBdev3", 00:20:17.567 "uuid": "9183ad43-f5ea-5f1c-8710-ef5fe282af46", 00:20:17.567 "is_configured": true, 00:20:17.567 "data_offset": 2048, 00:20:17.567 "data_size": 63488 00:20:17.567 }, 00:20:17.567 { 00:20:17.567 "name": "BaseBdev4", 00:20:17.567 "uuid": "dad05a9e-f221-53b4-b1c8-7e85e08dffdf", 00:20:17.567 "is_configured": true, 00:20:17.567 "data_offset": 2048, 00:20:17.567 "data_size": 63488 00:20:17.567 } 00:20:17.567 ] 00:20:17.567 }' 00:20:17.567 07:25:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.567 07:25:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:18.135 07:25:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:20:18.135 07:25:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:18.393 [2024-07-25 07:25:50.709130] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2005f60 00:20:19.330 07:25:51 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:19.330 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:20:19.330 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:20:19.330 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:20:19.330 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:19.330 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:19.330 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.331 07:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:19.590 07:25:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.590 "name": "raid_bdev1", 00:20:19.590 "uuid": "40d50a7c-7f3a-4126-833c-30aba39038a1", 00:20:19.590 "strip_size_kb": 64, 00:20:19.590 "state": "online", 00:20:19.590 "raid_level": "raid0", 00:20:19.590 "superblock": true, 00:20:19.590 "num_base_bdevs": 4, 00:20:19.590 "num_base_bdevs_discovered": 4, 00:20:19.590 "num_base_bdevs_operational": 4, 00:20:19.590 "base_bdevs_list": [ 00:20:19.590 { 00:20:19.590 "name": "BaseBdev1", 00:20:19.590 "uuid": "5eb2e5a6-2c5e-5e86-9c70-143e2cd38b44", 00:20:19.590 "is_configured": true, 00:20:19.590 "data_offset": 2048, 00:20:19.590 "data_size": 63488 00:20:19.590 }, 00:20:19.590 { 00:20:19.590 "name": "BaseBdev2", 00:20:19.590 "uuid": "f2bb8a5b-002d-5fa2-b868-cd35c84c02c3", 00:20:19.590 "is_configured": true, 00:20:19.590 "data_offset": 2048, 00:20:19.590 "data_size": 63488 00:20:19.590 }, 00:20:19.590 { 00:20:19.590 "name": "BaseBdev3", 00:20:19.590 "uuid": "9183ad43-f5ea-5f1c-8710-ef5fe282af46", 00:20:19.590 "is_configured": true, 00:20:19.590 "data_offset": 2048, 00:20:19.590 "data_size": 63488 00:20:19.590 }, 00:20:19.590 { 00:20:19.590 "name": "BaseBdev4", 00:20:19.590 "uuid": "dad05a9e-f221-53b4-b1c8-7e85e08dffdf", 00:20:19.590 "is_configured": true, 00:20:19.590 "data_offset": 2048, 00:20:19.590 "data_size": 63488 00:20:19.590 } 00:20:19.590 ] 00:20:19.590 }' 00:20:19.590 07:25:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.590 07:25:52 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.158 07:25:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:20.417 [2024-07-25 07:25:52.863865] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:20.417 [2024-07-25 07:25:52.863897] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:20.417 [2024-07-25 07:25:52.866805] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:20.417 [2024-07-25 07:25:52.866842] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.417 [2024-07-25 07:25:52.866878] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:20.417 [2024-07-25 07:25:52.866888] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21115d0 name raid_bdev1, state offline 00:20:20.417 0 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1676578 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1676578 ']' 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1676578 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1676578 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1676578' 00:20:20.417 killing process with pid 1676578 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1676578 00:20:20.417 [2024-07-25 07:25:52.939432] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:20.417 07:25:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1676578 00:20:20.712 [2024-07-25 07:25:52.967350] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.oakHVXRy80 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:20:20.712 00:20:20.712 real 0m7.258s 00:20:20.712 user 0m11.559s 00:20:20.712 sys 0m1.262s 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:20:20.712 07:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.712 ************************************ 00:20:20.712 END TEST raid_write_error_test 00:20:20.712 ************************************ 00:20:20.712 07:25:53 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:20:20.712 07:25:53 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:20:20.712 07:25:53 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:20.712 07:25:53 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:20.712 07:25:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:20.972 ************************************ 00:20:20.972 START TEST raid_state_function_test 00:20:20.972 ************************************ 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 false 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 
00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1677988 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1677988' 00:20:20.972 Process raid pid: 1677988 00:20:20.972 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:20.973 07:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1677988 /var/tmp/spdk-raid.sock 00:20:20.973 07:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1677988 ']' 00:20:20.973 07:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:20.973 07:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:20.973 07:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:20.973 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:20.973 07:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:20.973 07:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.973 [2024-07-25 07:25:53.332810] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
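
Unlike the two error tests, which drive a bdevperf instance, the state-function test stands up a bare bdev_svc app: the trace above launches it with the same RPC socket and the bdev_raid debug flag, records its pid (1677988), and waits for the socket to accept RPCs before proceeding. A minimal sketch of that startup, with the bdev_svc path abbreviated and the backgrounding/$! bookkeeping assumed from the waitforlisten call:

    # bdev_svc = test/app/bdev_svc/bdev_svc from the SPDK tree checked out for this job.
    bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!                                          # 1677988 in this run
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock    # autotest_common.sh helper
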
00:20:20.973 [2024-07-25 07:25:53.332874] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.0 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.4 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:20.973 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:20.973 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:20.973 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:20.973 [2024-07-25 07:25:53.465843] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:21.232 [2024-07-25 07:25:53.548179] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:20:21.232 [2024-07-25 07:25:53.609065] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:21.232 [2024-07-25 07:25:53.609101] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:21.799 07:25:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:21.799 07:25:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:20:21.799 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:22.058 [2024-07-25 07:25:54.388656] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:22.058 [2024-07-25 07:25:54.388694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:22.058 [2024-07-25 07:25:54.388705] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:22.058 [2024-07-25 07:25:54.388715] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:22.058 [2024-07-25 07:25:54.388723] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:22.058 [2024-07-25 07:25:54.388734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:22.058 [2024-07-25 07:25:54.388741] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:22.058 [2024-07-25 07:25:54.388752] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.058 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:22.317 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.317 "name": "Existed_Raid", 00:20:22.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.317 "strip_size_kb": 64, 00:20:22.317 "state": "configuring", 00:20:22.317 "raid_level": "concat", 00:20:22.317 "superblock": false, 00:20:22.317 "num_base_bdevs": 4, 00:20:22.317 "num_base_bdevs_discovered": 0, 00:20:22.317 "num_base_bdevs_operational": 4, 00:20:22.317 "base_bdevs_list": [ 00:20:22.317 { 00:20:22.317 "name": "BaseBdev1", 00:20:22.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.317 "is_configured": false, 00:20:22.317 "data_offset": 0, 00:20:22.317 "data_size": 0 00:20:22.317 }, 00:20:22.317 { 00:20:22.317 "name": "BaseBdev2", 00:20:22.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.317 "is_configured": false, 00:20:22.317 "data_offset": 0, 00:20:22.317 "data_size": 0 00:20:22.317 }, 00:20:22.317 { 00:20:22.317 "name": "BaseBdev3", 00:20:22.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.317 "is_configured": false, 00:20:22.317 "data_offset": 0, 00:20:22.317 "data_size": 0 00:20:22.317 }, 00:20:22.317 { 00:20:22.317 "name": "BaseBdev4", 00:20:22.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.317 "is_configured": false, 00:20:22.317 "data_offset": 0, 00:20:22.317 "data_size": 0 00:20:22.317 } 00:20:22.317 ] 00:20:22.317 }' 00:20:22.317 07:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.317 07:25:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.885 07:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:22.885 [2024-07-25 07:25:55.419256] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:22.885 [2024-07-25 07:25:55.419284] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x95aee0 name Existed_Raid, state configuring 00:20:23.144 07:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:23.144 
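The create/verify pair that repeats throughout this trace boils down to two RPC calls. A minimal sketch follows; the rpc() helper is shorthand introduced here, and the trailing .state extraction is illustrative (verify_raid_bdev_state compares several fields, not just the state):

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # Register a 4-member concat raid with a 64 KiB strip size. The base bdevs do
    # not have to exist yet; until all four are discovered the raid stays "configuring".
    rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

    # Read the raid back the same way the test does and pick out its state.
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'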
[2024-07-25 07:25:55.651880] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:23.144 [2024-07-25 07:25:55.651906] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:23.144 [2024-07-25 07:25:55.651915] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:23.144 [2024-07-25 07:25:55.651925] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:23.144 [2024-07-25 07:25:55.651933] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:23.144 [2024-07-25 07:25:55.651942] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:23.144 [2024-07-25 07:25:55.651950] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:23.144 [2024-07-25 07:25:55.651960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:23.144 07:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:23.403 [2024-07-25 07:25:55.894244] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:23.403 BaseBdev1 00:20:23.403 07:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:23.403 07:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:23.403 07:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:23.403 07:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:23.403 07:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:23.403 07:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:23.403 07:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:23.662 07:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:23.920 [ 00:20:23.920 { 00:20:23.920 "name": "BaseBdev1", 00:20:23.920 "aliases": [ 00:20:23.920 "46c2017d-e0d5-4cdf-89b5-5d4f49226afa" 00:20:23.920 ], 00:20:23.920 "product_name": "Malloc disk", 00:20:23.920 "block_size": 512, 00:20:23.920 "num_blocks": 65536, 00:20:23.920 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:23.920 "assigned_rate_limits": { 00:20:23.920 "rw_ios_per_sec": 0, 00:20:23.920 "rw_mbytes_per_sec": 0, 00:20:23.920 "r_mbytes_per_sec": 0, 00:20:23.920 "w_mbytes_per_sec": 0 00:20:23.920 }, 00:20:23.920 "claimed": true, 00:20:23.920 "claim_type": "exclusive_write", 00:20:23.920 "zoned": false, 00:20:23.920 "supported_io_types": { 00:20:23.920 "read": true, 00:20:23.920 "write": true, 00:20:23.920 "unmap": true, 00:20:23.920 "flush": true, 00:20:23.920 "reset": true, 00:20:23.920 "nvme_admin": false, 00:20:23.920 "nvme_io": false, 00:20:23.920 "nvme_io_md": false, 00:20:23.920 "write_zeroes": true, 00:20:23.920 "zcopy": true, 00:20:23.920 "get_zone_info": false, 00:20:23.920 "zone_management": false, 00:20:23.920 
"zone_append": false, 00:20:23.920 "compare": false, 00:20:23.920 "compare_and_write": false, 00:20:23.920 "abort": true, 00:20:23.920 "seek_hole": false, 00:20:23.920 "seek_data": false, 00:20:23.920 "copy": true, 00:20:23.920 "nvme_iov_md": false 00:20:23.920 }, 00:20:23.920 "memory_domains": [ 00:20:23.920 { 00:20:23.920 "dma_device_id": "system", 00:20:23.920 "dma_device_type": 1 00:20:23.920 }, 00:20:23.920 { 00:20:23.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.920 "dma_device_type": 2 00:20:23.920 } 00:20:23.920 ], 00:20:23.920 "driver_specific": {} 00:20:23.920 } 00:20:23.920 ] 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.920 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:24.179 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.179 "name": "Existed_Raid", 00:20:24.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.179 "strip_size_kb": 64, 00:20:24.179 "state": "configuring", 00:20:24.179 "raid_level": "concat", 00:20:24.179 "superblock": false, 00:20:24.179 "num_base_bdevs": 4, 00:20:24.179 "num_base_bdevs_discovered": 1, 00:20:24.179 "num_base_bdevs_operational": 4, 00:20:24.179 "base_bdevs_list": [ 00:20:24.179 { 00:20:24.179 "name": "BaseBdev1", 00:20:24.179 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:24.179 "is_configured": true, 00:20:24.179 "data_offset": 0, 00:20:24.179 "data_size": 65536 00:20:24.179 }, 00:20:24.179 { 00:20:24.179 "name": "BaseBdev2", 00:20:24.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.179 "is_configured": false, 00:20:24.179 "data_offset": 0, 00:20:24.179 "data_size": 0 00:20:24.179 }, 00:20:24.179 { 00:20:24.179 "name": "BaseBdev3", 00:20:24.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.179 "is_configured": false, 00:20:24.179 "data_offset": 0, 00:20:24.179 "data_size": 0 00:20:24.179 }, 00:20:24.179 { 00:20:24.179 "name": "BaseBdev4", 00:20:24.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.179 "is_configured": false, 00:20:24.179 "data_offset": 0, 
00:20:24.179 "data_size": 0 00:20:24.179 } 00:20:24.179 ] 00:20:24.179 }' 00:20:24.179 07:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.179 07:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:24.744 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:25.002 [2024-07-25 07:25:57.390279] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:25.002 [2024-07-25 07:25:57.390318] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x95a750 name Existed_Raid, state configuring 00:20:25.002 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:25.260 [2024-07-25 07:25:57.618916] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:25.260 [2024-07-25 07:25:57.620321] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:25.260 [2024-07-25 07:25:57.620353] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:25.260 [2024-07-25 07:25:57.620363] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:25.260 [2024-07-25 07:25:57.620377] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:25.260 [2024-07-25 07:25:57.620386] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:25.260 [2024-07-25 07:25:57.620396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.260 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.260 07:25:57 
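Each BaseBdevN used by this test is a plain malloc bdev: 32 MiB of RAM with a 512-byte block size (hence the 65536 num_blocks in the dumps above), which the configuring raid claims with exclusive_write as soon as it appears. Below is a sketch of the create-and-wait step for one member, reusing the same rpc() shorthand as above; the two wait RPCs mirror what the waitforbdev helper issues in this log:

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # Create one 32 MiB / 512-byte-block malloc bdev for the raid to claim.
    rpc bdev_malloc_create 32 512 -b BaseBdev1

    # Let examine callbacks settle, then look the bdev up with a 2000 ms timeout,
    # the same two RPCs waitforbdev runs in the trace.
    rpc bdev_wait_for_examine
    rpc bdev_get_bdevs -b BaseBdev1 -t 2000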
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.519 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.519 "name": "Existed_Raid", 00:20:25.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.519 "strip_size_kb": 64, 00:20:25.519 "state": "configuring", 00:20:25.519 "raid_level": "concat", 00:20:25.519 "superblock": false, 00:20:25.519 "num_base_bdevs": 4, 00:20:25.519 "num_base_bdevs_discovered": 1, 00:20:25.519 "num_base_bdevs_operational": 4, 00:20:25.519 "base_bdevs_list": [ 00:20:25.519 { 00:20:25.519 "name": "BaseBdev1", 00:20:25.519 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:25.519 "is_configured": true, 00:20:25.519 "data_offset": 0, 00:20:25.519 "data_size": 65536 00:20:25.519 }, 00:20:25.519 { 00:20:25.519 "name": "BaseBdev2", 00:20:25.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.519 "is_configured": false, 00:20:25.519 "data_offset": 0, 00:20:25.519 "data_size": 0 00:20:25.519 }, 00:20:25.519 { 00:20:25.519 "name": "BaseBdev3", 00:20:25.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.519 "is_configured": false, 00:20:25.519 "data_offset": 0, 00:20:25.519 "data_size": 0 00:20:25.519 }, 00:20:25.519 { 00:20:25.519 "name": "BaseBdev4", 00:20:25.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.519 "is_configured": false, 00:20:25.519 "data_offset": 0, 00:20:25.519 "data_size": 0 00:20:25.519 } 00:20:25.519 ] 00:20:25.519 }' 00:20:25.519 07:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.519 07:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.086 07:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:26.344 [2024-07-25 07:25:58.640675] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:26.344 BaseBdev2 00:20:26.345 07:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:26.345 07:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:26.345 07:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:26.345 07:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:26.345 07:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:26.345 07:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:26.345 07:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:26.603 07:25:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:26.603 [ 00:20:26.603 { 00:20:26.603 "name": "BaseBdev2", 00:20:26.603 "aliases": [ 00:20:26.603 "47ce1787-f513-4f6b-80aa-8c195d793f78" 00:20:26.603 ], 00:20:26.603 "product_name": "Malloc disk", 00:20:26.603 "block_size": 512, 00:20:26.603 "num_blocks": 65536, 00:20:26.603 "uuid": "47ce1787-f513-4f6b-80aa-8c195d793f78", 00:20:26.603 
"assigned_rate_limits": { 00:20:26.603 "rw_ios_per_sec": 0, 00:20:26.603 "rw_mbytes_per_sec": 0, 00:20:26.603 "r_mbytes_per_sec": 0, 00:20:26.603 "w_mbytes_per_sec": 0 00:20:26.603 }, 00:20:26.603 "claimed": true, 00:20:26.603 "claim_type": "exclusive_write", 00:20:26.603 "zoned": false, 00:20:26.603 "supported_io_types": { 00:20:26.603 "read": true, 00:20:26.603 "write": true, 00:20:26.603 "unmap": true, 00:20:26.603 "flush": true, 00:20:26.603 "reset": true, 00:20:26.603 "nvme_admin": false, 00:20:26.603 "nvme_io": false, 00:20:26.603 "nvme_io_md": false, 00:20:26.603 "write_zeroes": true, 00:20:26.603 "zcopy": true, 00:20:26.603 "get_zone_info": false, 00:20:26.603 "zone_management": false, 00:20:26.603 "zone_append": false, 00:20:26.603 "compare": false, 00:20:26.603 "compare_and_write": false, 00:20:26.603 "abort": true, 00:20:26.603 "seek_hole": false, 00:20:26.603 "seek_data": false, 00:20:26.603 "copy": true, 00:20:26.603 "nvme_iov_md": false 00:20:26.603 }, 00:20:26.603 "memory_domains": [ 00:20:26.603 { 00:20:26.603 "dma_device_id": "system", 00:20:26.603 "dma_device_type": 1 00:20:26.603 }, 00:20:26.603 { 00:20:26.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.603 "dma_device_type": 2 00:20:26.603 } 00:20:26.603 ], 00:20:26.603 "driver_specific": {} 00:20:26.603 } 00:20:26.603 ] 00:20:26.603 07:25:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:26.603 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:26.603 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:26.603 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.604 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:26.860 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.860 "name": "Existed_Raid", 00:20:26.860 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.860 "strip_size_kb": 64, 00:20:26.860 "state": "configuring", 00:20:26.860 "raid_level": "concat", 00:20:26.860 "superblock": false, 00:20:26.860 "num_base_bdevs": 4, 00:20:26.860 "num_base_bdevs_discovered": 2, 
00:20:26.860 "num_base_bdevs_operational": 4, 00:20:26.860 "base_bdevs_list": [ 00:20:26.860 { 00:20:26.860 "name": "BaseBdev1", 00:20:26.860 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:26.860 "is_configured": true, 00:20:26.860 "data_offset": 0, 00:20:26.860 "data_size": 65536 00:20:26.860 }, 00:20:26.860 { 00:20:26.860 "name": "BaseBdev2", 00:20:26.860 "uuid": "47ce1787-f513-4f6b-80aa-8c195d793f78", 00:20:26.860 "is_configured": true, 00:20:26.860 "data_offset": 0, 00:20:26.860 "data_size": 65536 00:20:26.860 }, 00:20:26.860 { 00:20:26.860 "name": "BaseBdev3", 00:20:26.860 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.860 "is_configured": false, 00:20:26.860 "data_offset": 0, 00:20:26.860 "data_size": 0 00:20:26.860 }, 00:20:26.860 { 00:20:26.860 "name": "BaseBdev4", 00:20:26.860 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.860 "is_configured": false, 00:20:26.860 "data_offset": 0, 00:20:26.860 "data_size": 0 00:20:26.860 } 00:20:26.860 ] 00:20:26.860 }' 00:20:26.860 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.860 07:25:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.426 07:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:27.685 [2024-07-25 07:26:00.103624] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:27.685 BaseBdev3 00:20:27.685 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:27.685 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:27.685 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:27.685 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:27.685 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:27.685 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:27.685 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:27.943 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:28.202 [ 00:20:28.202 { 00:20:28.202 "name": "BaseBdev3", 00:20:28.202 "aliases": [ 00:20:28.202 "10255088-fe99-4a21-9c0d-327a7a0ff64b" 00:20:28.202 ], 00:20:28.202 "product_name": "Malloc disk", 00:20:28.202 "block_size": 512, 00:20:28.202 "num_blocks": 65536, 00:20:28.202 "uuid": "10255088-fe99-4a21-9c0d-327a7a0ff64b", 00:20:28.202 "assigned_rate_limits": { 00:20:28.202 "rw_ios_per_sec": 0, 00:20:28.202 "rw_mbytes_per_sec": 0, 00:20:28.202 "r_mbytes_per_sec": 0, 00:20:28.202 "w_mbytes_per_sec": 0 00:20:28.202 }, 00:20:28.202 "claimed": true, 00:20:28.202 "claim_type": "exclusive_write", 00:20:28.202 "zoned": false, 00:20:28.202 "supported_io_types": { 00:20:28.202 "read": true, 00:20:28.202 "write": true, 00:20:28.202 "unmap": true, 00:20:28.202 "flush": true, 00:20:28.202 "reset": true, 00:20:28.202 "nvme_admin": false, 00:20:28.202 "nvme_io": false, 00:20:28.202 
"nvme_io_md": false, 00:20:28.202 "write_zeroes": true, 00:20:28.202 "zcopy": true, 00:20:28.202 "get_zone_info": false, 00:20:28.202 "zone_management": false, 00:20:28.202 "zone_append": false, 00:20:28.202 "compare": false, 00:20:28.202 "compare_and_write": false, 00:20:28.202 "abort": true, 00:20:28.202 "seek_hole": false, 00:20:28.202 "seek_data": false, 00:20:28.202 "copy": true, 00:20:28.202 "nvme_iov_md": false 00:20:28.202 }, 00:20:28.202 "memory_domains": [ 00:20:28.202 { 00:20:28.202 "dma_device_id": "system", 00:20:28.202 "dma_device_type": 1 00:20:28.202 }, 00:20:28.202 { 00:20:28.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.202 "dma_device_type": 2 00:20:28.202 } 00:20:28.202 ], 00:20:28.202 "driver_specific": {} 00:20:28.202 } 00:20:28.202 ] 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.202 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:28.461 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.461 "name": "Existed_Raid", 00:20:28.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.461 "strip_size_kb": 64, 00:20:28.461 "state": "configuring", 00:20:28.461 "raid_level": "concat", 00:20:28.461 "superblock": false, 00:20:28.461 "num_base_bdevs": 4, 00:20:28.461 "num_base_bdevs_discovered": 3, 00:20:28.461 "num_base_bdevs_operational": 4, 00:20:28.461 "base_bdevs_list": [ 00:20:28.461 { 00:20:28.461 "name": "BaseBdev1", 00:20:28.461 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:28.461 "is_configured": true, 00:20:28.461 "data_offset": 0, 00:20:28.461 "data_size": 65536 00:20:28.461 }, 00:20:28.461 { 00:20:28.461 "name": "BaseBdev2", 00:20:28.461 "uuid": "47ce1787-f513-4f6b-80aa-8c195d793f78", 00:20:28.461 "is_configured": true, 00:20:28.461 "data_offset": 0, 00:20:28.461 "data_size": 65536 00:20:28.461 }, 00:20:28.461 { 
00:20:28.461 "name": "BaseBdev3", 00:20:28.461 "uuid": "10255088-fe99-4a21-9c0d-327a7a0ff64b", 00:20:28.461 "is_configured": true, 00:20:28.461 "data_offset": 0, 00:20:28.461 "data_size": 65536 00:20:28.461 }, 00:20:28.461 { 00:20:28.461 "name": "BaseBdev4", 00:20:28.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.461 "is_configured": false, 00:20:28.461 "data_offset": 0, 00:20:28.461 "data_size": 0 00:20:28.461 } 00:20:28.461 ] 00:20:28.461 }' 00:20:28.461 07:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.461 07:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.031 07:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:29.289 [2024-07-25 07:26:01.582657] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:29.289 [2024-07-25 07:26:01.582693] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x95b7b0 00:20:29.289 [2024-07-25 07:26:01.582700] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:29.289 [2024-07-25 07:26:01.582883] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb0e9d0 00:20:29.289 [2024-07-25 07:26:01.583003] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x95b7b0 00:20:29.289 [2024-07-25 07:26:01.583013] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x95b7b0 00:20:29.289 [2024-07-25 07:26:01.583172] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:29.289 BaseBdev4 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:29.289 07:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:29.547 [ 00:20:29.547 { 00:20:29.547 "name": "BaseBdev4", 00:20:29.547 "aliases": [ 00:20:29.547 "582cde61-37ab-4c25-92e2-c665a27b9014" 00:20:29.547 ], 00:20:29.547 "product_name": "Malloc disk", 00:20:29.547 "block_size": 512, 00:20:29.547 "num_blocks": 65536, 00:20:29.547 "uuid": "582cde61-37ab-4c25-92e2-c665a27b9014", 00:20:29.547 "assigned_rate_limits": { 00:20:29.547 "rw_ios_per_sec": 0, 00:20:29.547 "rw_mbytes_per_sec": 0, 00:20:29.547 "r_mbytes_per_sec": 0, 00:20:29.547 "w_mbytes_per_sec": 0 00:20:29.547 }, 00:20:29.547 "claimed": true, 00:20:29.547 "claim_type": "exclusive_write", 00:20:29.547 "zoned": false, 00:20:29.547 "supported_io_types": { 
00:20:29.547 "read": true, 00:20:29.547 "write": true, 00:20:29.547 "unmap": true, 00:20:29.547 "flush": true, 00:20:29.547 "reset": true, 00:20:29.548 "nvme_admin": false, 00:20:29.548 "nvme_io": false, 00:20:29.548 "nvme_io_md": false, 00:20:29.548 "write_zeroes": true, 00:20:29.548 "zcopy": true, 00:20:29.548 "get_zone_info": false, 00:20:29.548 "zone_management": false, 00:20:29.548 "zone_append": false, 00:20:29.548 "compare": false, 00:20:29.548 "compare_and_write": false, 00:20:29.548 "abort": true, 00:20:29.548 "seek_hole": false, 00:20:29.548 "seek_data": false, 00:20:29.548 "copy": true, 00:20:29.548 "nvme_iov_md": false 00:20:29.548 }, 00:20:29.548 "memory_domains": [ 00:20:29.548 { 00:20:29.548 "dma_device_id": "system", 00:20:29.548 "dma_device_type": 1 00:20:29.548 }, 00:20:29.548 { 00:20:29.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.548 "dma_device_type": 2 00:20:29.548 } 00:20:29.548 ], 00:20:29.548 "driver_specific": {} 00:20:29.548 } 00:20:29.548 ] 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.548 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.806 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.806 "name": "Existed_Raid", 00:20:29.806 "uuid": "d078d55d-3e39-4fca-8f7b-32b600f63225", 00:20:29.806 "strip_size_kb": 64, 00:20:29.806 "state": "online", 00:20:29.806 "raid_level": "concat", 00:20:29.806 "superblock": false, 00:20:29.806 "num_base_bdevs": 4, 00:20:29.806 "num_base_bdevs_discovered": 4, 00:20:29.806 "num_base_bdevs_operational": 4, 00:20:29.806 "base_bdevs_list": [ 00:20:29.806 { 00:20:29.806 "name": "BaseBdev1", 00:20:29.807 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:29.807 "is_configured": true, 00:20:29.807 "data_offset": 0, 00:20:29.807 "data_size": 65536 00:20:29.807 }, 00:20:29.807 { 00:20:29.807 "name": 
"BaseBdev2", 00:20:29.807 "uuid": "47ce1787-f513-4f6b-80aa-8c195d793f78", 00:20:29.807 "is_configured": true, 00:20:29.807 "data_offset": 0, 00:20:29.807 "data_size": 65536 00:20:29.807 }, 00:20:29.807 { 00:20:29.807 "name": "BaseBdev3", 00:20:29.807 "uuid": "10255088-fe99-4a21-9c0d-327a7a0ff64b", 00:20:29.807 "is_configured": true, 00:20:29.807 "data_offset": 0, 00:20:29.807 "data_size": 65536 00:20:29.807 }, 00:20:29.807 { 00:20:29.807 "name": "BaseBdev4", 00:20:29.807 "uuid": "582cde61-37ab-4c25-92e2-c665a27b9014", 00:20:29.807 "is_configured": true, 00:20:29.807 "data_offset": 0, 00:20:29.807 "data_size": 65536 00:20:29.807 } 00:20:29.807 ] 00:20:29.807 }' 00:20:29.807 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.807 07:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:30.374 07:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:30.633 [2024-07-25 07:26:03.058866] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:30.633 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:30.633 "name": "Existed_Raid", 00:20:30.633 "aliases": [ 00:20:30.633 "d078d55d-3e39-4fca-8f7b-32b600f63225" 00:20:30.633 ], 00:20:30.633 "product_name": "Raid Volume", 00:20:30.633 "block_size": 512, 00:20:30.633 "num_blocks": 262144, 00:20:30.633 "uuid": "d078d55d-3e39-4fca-8f7b-32b600f63225", 00:20:30.633 "assigned_rate_limits": { 00:20:30.633 "rw_ios_per_sec": 0, 00:20:30.633 "rw_mbytes_per_sec": 0, 00:20:30.633 "r_mbytes_per_sec": 0, 00:20:30.633 "w_mbytes_per_sec": 0 00:20:30.633 }, 00:20:30.633 "claimed": false, 00:20:30.633 "zoned": false, 00:20:30.633 "supported_io_types": { 00:20:30.633 "read": true, 00:20:30.633 "write": true, 00:20:30.633 "unmap": true, 00:20:30.633 "flush": true, 00:20:30.633 "reset": true, 00:20:30.633 "nvme_admin": false, 00:20:30.633 "nvme_io": false, 00:20:30.633 "nvme_io_md": false, 00:20:30.633 "write_zeroes": true, 00:20:30.633 "zcopy": false, 00:20:30.633 "get_zone_info": false, 00:20:30.633 "zone_management": false, 00:20:30.633 "zone_append": false, 00:20:30.633 "compare": false, 00:20:30.633 "compare_and_write": false, 00:20:30.633 "abort": false, 00:20:30.633 "seek_hole": false, 00:20:30.633 "seek_data": false, 00:20:30.633 "copy": false, 00:20:30.633 "nvme_iov_md": false 00:20:30.633 }, 00:20:30.633 "memory_domains": [ 00:20:30.633 { 00:20:30.633 "dma_device_id": "system", 00:20:30.633 "dma_device_type": 1 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.633 
"dma_device_type": 2 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "dma_device_id": "system", 00:20:30.633 "dma_device_type": 1 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.633 "dma_device_type": 2 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "dma_device_id": "system", 00:20:30.633 "dma_device_type": 1 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.633 "dma_device_type": 2 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "dma_device_id": "system", 00:20:30.633 "dma_device_type": 1 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.633 "dma_device_type": 2 00:20:30.633 } 00:20:30.633 ], 00:20:30.633 "driver_specific": { 00:20:30.633 "raid": { 00:20:30.633 "uuid": "d078d55d-3e39-4fca-8f7b-32b600f63225", 00:20:30.633 "strip_size_kb": 64, 00:20:30.633 "state": "online", 00:20:30.633 "raid_level": "concat", 00:20:30.633 "superblock": false, 00:20:30.633 "num_base_bdevs": 4, 00:20:30.633 "num_base_bdevs_discovered": 4, 00:20:30.633 "num_base_bdevs_operational": 4, 00:20:30.633 "base_bdevs_list": [ 00:20:30.633 { 00:20:30.633 "name": "BaseBdev1", 00:20:30.633 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:30.633 "is_configured": true, 00:20:30.633 "data_offset": 0, 00:20:30.633 "data_size": 65536 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "name": "BaseBdev2", 00:20:30.633 "uuid": "47ce1787-f513-4f6b-80aa-8c195d793f78", 00:20:30.633 "is_configured": true, 00:20:30.633 "data_offset": 0, 00:20:30.633 "data_size": 65536 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "name": "BaseBdev3", 00:20:30.633 "uuid": "10255088-fe99-4a21-9c0d-327a7a0ff64b", 00:20:30.633 "is_configured": true, 00:20:30.633 "data_offset": 0, 00:20:30.633 "data_size": 65536 00:20:30.633 }, 00:20:30.633 { 00:20:30.633 "name": "BaseBdev4", 00:20:30.633 "uuid": "582cde61-37ab-4c25-92e2-c665a27b9014", 00:20:30.633 "is_configured": true, 00:20:30.633 "data_offset": 0, 00:20:30.633 "data_size": 65536 00:20:30.633 } 00:20:30.633 ] 00:20:30.633 } 00:20:30.633 } 00:20:30.633 }' 00:20:30.633 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:30.633 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:30.633 BaseBdev2 00:20:30.633 BaseBdev3 00:20:30.633 BaseBdev4' 00:20:30.633 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:30.633 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:30.633 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:30.892 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:30.892 "name": "BaseBdev1", 00:20:30.892 "aliases": [ 00:20:30.892 "46c2017d-e0d5-4cdf-89b5-5d4f49226afa" 00:20:30.892 ], 00:20:30.892 "product_name": "Malloc disk", 00:20:30.892 "block_size": 512, 00:20:30.892 "num_blocks": 65536, 00:20:30.892 "uuid": "46c2017d-e0d5-4cdf-89b5-5d4f49226afa", 00:20:30.892 "assigned_rate_limits": { 00:20:30.892 "rw_ios_per_sec": 0, 00:20:30.892 "rw_mbytes_per_sec": 0, 00:20:30.892 "r_mbytes_per_sec": 0, 00:20:30.892 "w_mbytes_per_sec": 0 00:20:30.892 }, 00:20:30.892 "claimed": true, 00:20:30.892 "claim_type": "exclusive_write", 
00:20:30.892 "zoned": false, 00:20:30.892 "supported_io_types": { 00:20:30.892 "read": true, 00:20:30.892 "write": true, 00:20:30.892 "unmap": true, 00:20:30.892 "flush": true, 00:20:30.892 "reset": true, 00:20:30.892 "nvme_admin": false, 00:20:30.892 "nvme_io": false, 00:20:30.892 "nvme_io_md": false, 00:20:30.892 "write_zeroes": true, 00:20:30.892 "zcopy": true, 00:20:30.892 "get_zone_info": false, 00:20:30.892 "zone_management": false, 00:20:30.892 "zone_append": false, 00:20:30.892 "compare": false, 00:20:30.892 "compare_and_write": false, 00:20:30.892 "abort": true, 00:20:30.892 "seek_hole": false, 00:20:30.892 "seek_data": false, 00:20:30.892 "copy": true, 00:20:30.892 "nvme_iov_md": false 00:20:30.892 }, 00:20:30.892 "memory_domains": [ 00:20:30.892 { 00:20:30.892 "dma_device_id": "system", 00:20:30.892 "dma_device_type": 1 00:20:30.892 }, 00:20:30.892 { 00:20:30.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.892 "dma_device_type": 2 00:20:30.892 } 00:20:30.892 ], 00:20:30.892 "driver_specific": {} 00:20:30.892 }' 00:20:30.892 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.892 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.150 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.408 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.408 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.408 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:31.408 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.408 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.408 "name": "BaseBdev2", 00:20:31.408 "aliases": [ 00:20:31.408 "47ce1787-f513-4f6b-80aa-8c195d793f78" 00:20:31.408 ], 00:20:31.408 "product_name": "Malloc disk", 00:20:31.408 "block_size": 512, 00:20:31.408 "num_blocks": 65536, 00:20:31.408 "uuid": "47ce1787-f513-4f6b-80aa-8c195d793f78", 00:20:31.408 "assigned_rate_limits": { 00:20:31.408 "rw_ios_per_sec": 0, 00:20:31.408 "rw_mbytes_per_sec": 0, 00:20:31.408 "r_mbytes_per_sec": 0, 00:20:31.408 "w_mbytes_per_sec": 0 00:20:31.408 }, 00:20:31.408 "claimed": true, 00:20:31.408 "claim_type": "exclusive_write", 00:20:31.408 "zoned": false, 00:20:31.408 "supported_io_types": { 00:20:31.408 "read": true, 00:20:31.408 "write": true, 00:20:31.408 "unmap": true, 00:20:31.408 "flush": true, 
00:20:31.408 "reset": true, 00:20:31.408 "nvme_admin": false, 00:20:31.408 "nvme_io": false, 00:20:31.408 "nvme_io_md": false, 00:20:31.408 "write_zeroes": true, 00:20:31.408 "zcopy": true, 00:20:31.408 "get_zone_info": false, 00:20:31.408 "zone_management": false, 00:20:31.408 "zone_append": false, 00:20:31.408 "compare": false, 00:20:31.408 "compare_and_write": false, 00:20:31.408 "abort": true, 00:20:31.408 "seek_hole": false, 00:20:31.408 "seek_data": false, 00:20:31.408 "copy": true, 00:20:31.408 "nvme_iov_md": false 00:20:31.408 }, 00:20:31.408 "memory_domains": [ 00:20:31.408 { 00:20:31.408 "dma_device_id": "system", 00:20:31.408 "dma_device_type": 1 00:20:31.408 }, 00:20:31.408 { 00:20:31.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.408 "dma_device_type": 2 00:20:31.408 } 00:20:31.408 ], 00:20:31.408 "driver_specific": {} 00:20:31.408 }' 00:20:31.408 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.667 07:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.667 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.926 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.926 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.926 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.926 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:31.926 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.185 "name": "BaseBdev3", 00:20:32.185 "aliases": [ 00:20:32.185 "10255088-fe99-4a21-9c0d-327a7a0ff64b" 00:20:32.185 ], 00:20:32.185 "product_name": "Malloc disk", 00:20:32.185 "block_size": 512, 00:20:32.185 "num_blocks": 65536, 00:20:32.185 "uuid": "10255088-fe99-4a21-9c0d-327a7a0ff64b", 00:20:32.185 "assigned_rate_limits": { 00:20:32.185 "rw_ios_per_sec": 0, 00:20:32.185 "rw_mbytes_per_sec": 0, 00:20:32.185 "r_mbytes_per_sec": 0, 00:20:32.185 "w_mbytes_per_sec": 0 00:20:32.185 }, 00:20:32.185 "claimed": true, 00:20:32.185 "claim_type": "exclusive_write", 00:20:32.185 "zoned": false, 00:20:32.185 "supported_io_types": { 00:20:32.185 "read": true, 00:20:32.185 "write": true, 00:20:32.185 "unmap": true, 00:20:32.185 "flush": true, 00:20:32.185 "reset": true, 00:20:32.185 "nvme_admin": false, 00:20:32.185 "nvme_io": false, 00:20:32.185 "nvme_io_md": false, 00:20:32.185 "write_zeroes": true, 00:20:32.185 
"zcopy": true, 00:20:32.185 "get_zone_info": false, 00:20:32.185 "zone_management": false, 00:20:32.185 "zone_append": false, 00:20:32.185 "compare": false, 00:20:32.185 "compare_and_write": false, 00:20:32.185 "abort": true, 00:20:32.185 "seek_hole": false, 00:20:32.185 "seek_data": false, 00:20:32.185 "copy": true, 00:20:32.185 "nvme_iov_md": false 00:20:32.185 }, 00:20:32.185 "memory_domains": [ 00:20:32.185 { 00:20:32.185 "dma_device_id": "system", 00:20:32.185 "dma_device_type": 1 00:20:32.185 }, 00:20:32.185 { 00:20:32.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.185 "dma_device_type": 2 00:20:32.185 } 00:20:32.185 ], 00:20:32.185 "driver_specific": {} 00:20:32.185 }' 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.185 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.444 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.444 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.444 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.444 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.444 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:32.444 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:32.444 07:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.703 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.703 "name": "BaseBdev4", 00:20:32.703 "aliases": [ 00:20:32.703 "582cde61-37ab-4c25-92e2-c665a27b9014" 00:20:32.703 ], 00:20:32.703 "product_name": "Malloc disk", 00:20:32.703 "block_size": 512, 00:20:32.703 "num_blocks": 65536, 00:20:32.703 "uuid": "582cde61-37ab-4c25-92e2-c665a27b9014", 00:20:32.703 "assigned_rate_limits": { 00:20:32.703 "rw_ios_per_sec": 0, 00:20:32.703 "rw_mbytes_per_sec": 0, 00:20:32.703 "r_mbytes_per_sec": 0, 00:20:32.703 "w_mbytes_per_sec": 0 00:20:32.703 }, 00:20:32.703 "claimed": true, 00:20:32.703 "claim_type": "exclusive_write", 00:20:32.703 "zoned": false, 00:20:32.703 "supported_io_types": { 00:20:32.703 "read": true, 00:20:32.703 "write": true, 00:20:32.703 "unmap": true, 00:20:32.703 "flush": true, 00:20:32.703 "reset": true, 00:20:32.703 "nvme_admin": false, 00:20:32.703 "nvme_io": false, 00:20:32.703 "nvme_io_md": false, 00:20:32.703 "write_zeroes": true, 00:20:32.703 "zcopy": true, 00:20:32.703 "get_zone_info": false, 00:20:32.703 "zone_management": false, 00:20:32.703 "zone_append": false, 00:20:32.703 "compare": false, 00:20:32.703 
"compare_and_write": false, 00:20:32.703 "abort": true, 00:20:32.703 "seek_hole": false, 00:20:32.703 "seek_data": false, 00:20:32.703 "copy": true, 00:20:32.703 "nvme_iov_md": false 00:20:32.703 }, 00:20:32.703 "memory_domains": [ 00:20:32.703 { 00:20:32.703 "dma_device_id": "system", 00:20:32.703 "dma_device_type": 1 00:20:32.703 }, 00:20:32.703 { 00:20:32.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.703 "dma_device_type": 2 00:20:32.703 } 00:20:32.703 ], 00:20:32.703 "driver_specific": {} 00:20:32.703 }' 00:20:32.703 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.703 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.703 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.703 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.703 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.962 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:33.221 [2024-07-25 07:26:05.637411] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:33.221 [2024-07-25 07:26:05.637439] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:33.221 [2024-07-25 07:26:05.637486] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.221 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:33.536 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.536 "name": "Existed_Raid", 00:20:33.536 "uuid": "d078d55d-3e39-4fca-8f7b-32b600f63225", 00:20:33.536 "strip_size_kb": 64, 00:20:33.536 "state": "offline", 00:20:33.536 "raid_level": "concat", 00:20:33.536 "superblock": false, 00:20:33.536 "num_base_bdevs": 4, 00:20:33.536 "num_base_bdevs_discovered": 3, 00:20:33.536 "num_base_bdevs_operational": 3, 00:20:33.536 "base_bdevs_list": [ 00:20:33.536 { 00:20:33.536 "name": null, 00:20:33.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.536 "is_configured": false, 00:20:33.536 "data_offset": 0, 00:20:33.536 "data_size": 65536 00:20:33.536 }, 00:20:33.536 { 00:20:33.536 "name": "BaseBdev2", 00:20:33.536 "uuid": "47ce1787-f513-4f6b-80aa-8c195d793f78", 00:20:33.536 "is_configured": true, 00:20:33.536 "data_offset": 0, 00:20:33.536 "data_size": 65536 00:20:33.536 }, 00:20:33.536 { 00:20:33.536 "name": "BaseBdev3", 00:20:33.536 "uuid": "10255088-fe99-4a21-9c0d-327a7a0ff64b", 00:20:33.536 "is_configured": true, 00:20:33.536 "data_offset": 0, 00:20:33.536 "data_size": 65536 00:20:33.536 }, 00:20:33.536 { 00:20:33.536 "name": "BaseBdev4", 00:20:33.536 "uuid": "582cde61-37ab-4c25-92e2-c665a27b9014", 00:20:33.536 "is_configured": true, 00:20:33.536 "data_offset": 0, 00:20:33.536 "data_size": 65536 00:20:33.536 } 00:20:33.536 ] 00:20:33.537 }' 00:20:33.537 07:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.537 07:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.105 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:34.105 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:34.105 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.105 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:34.364 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:34.364 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:34.364 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:34.364 [2024-07-25 07:26:06.897797] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:34.622 07:26:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:34.622 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:34.622 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.622 07:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:34.622 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:34.622 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:34.622 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:34.881 [2024-07-25 07:26:07.361063] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:34.881 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:34.881 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:34.881 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.881 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:35.140 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:35.140 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:35.140 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:35.398 [2024-07-25 07:26:07.812052] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:35.398 [2024-07-25 07:26:07.812092] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x95b7b0 name Existed_Raid, state offline 00:20:35.398 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:35.398 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:35.398 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.398 07:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:35.657 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:35.657 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:35.657 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:35.657 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:35.657 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:35.657 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:35.916 BaseBdev2 00:20:35.916 07:26:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:35.916 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:35.916 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:35.916 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:35.916 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:35.916 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:35.916 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:36.175 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:36.434 [ 00:20:36.434 { 00:20:36.434 "name": "BaseBdev2", 00:20:36.434 "aliases": [ 00:20:36.434 "af959a6f-6bd1-47ca-8918-24d9ed30336c" 00:20:36.434 ], 00:20:36.434 "product_name": "Malloc disk", 00:20:36.434 "block_size": 512, 00:20:36.434 "num_blocks": 65536, 00:20:36.434 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:36.434 "assigned_rate_limits": { 00:20:36.434 "rw_ios_per_sec": 0, 00:20:36.434 "rw_mbytes_per_sec": 0, 00:20:36.434 "r_mbytes_per_sec": 0, 00:20:36.434 "w_mbytes_per_sec": 0 00:20:36.434 }, 00:20:36.434 "claimed": false, 00:20:36.434 "zoned": false, 00:20:36.434 "supported_io_types": { 00:20:36.434 "read": true, 00:20:36.434 "write": true, 00:20:36.434 "unmap": true, 00:20:36.434 "flush": true, 00:20:36.434 "reset": true, 00:20:36.434 "nvme_admin": false, 00:20:36.434 "nvme_io": false, 00:20:36.434 "nvme_io_md": false, 00:20:36.434 "write_zeroes": true, 00:20:36.434 "zcopy": true, 00:20:36.434 "get_zone_info": false, 00:20:36.434 "zone_management": false, 00:20:36.434 "zone_append": false, 00:20:36.434 "compare": false, 00:20:36.434 "compare_and_write": false, 00:20:36.434 "abort": true, 00:20:36.434 "seek_hole": false, 00:20:36.434 "seek_data": false, 00:20:36.434 "copy": true, 00:20:36.434 "nvme_iov_md": false 00:20:36.434 }, 00:20:36.434 "memory_domains": [ 00:20:36.434 { 00:20:36.434 "dma_device_id": "system", 00:20:36.434 "dma_device_type": 1 00:20:36.434 }, 00:20:36.434 { 00:20:36.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.434 "dma_device_type": 2 00:20:36.434 } 00:20:36.434 ], 00:20:36.434 "driver_specific": {} 00:20:36.434 } 00:20:36.434 ] 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:36.434 BaseBdev3 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:36.434 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:36.693 07:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:36.693 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:36.951 [ 00:20:36.951 { 00:20:36.951 "name": "BaseBdev3", 00:20:36.951 "aliases": [ 00:20:36.951 "6a2a7dfc-9e01-4efa-bf87-8883e84518e7" 00:20:36.951 ], 00:20:36.951 "product_name": "Malloc disk", 00:20:36.951 "block_size": 512, 00:20:36.951 "num_blocks": 65536, 00:20:36.951 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:36.951 "assigned_rate_limits": { 00:20:36.951 "rw_ios_per_sec": 0, 00:20:36.951 "rw_mbytes_per_sec": 0, 00:20:36.951 "r_mbytes_per_sec": 0, 00:20:36.951 "w_mbytes_per_sec": 0 00:20:36.951 }, 00:20:36.952 "claimed": false, 00:20:36.952 "zoned": false, 00:20:36.952 "supported_io_types": { 00:20:36.952 "read": true, 00:20:36.952 "write": true, 00:20:36.952 "unmap": true, 00:20:36.952 "flush": true, 00:20:36.952 "reset": true, 00:20:36.952 "nvme_admin": false, 00:20:36.952 "nvme_io": false, 00:20:36.952 "nvme_io_md": false, 00:20:36.952 "write_zeroes": true, 00:20:36.952 "zcopy": true, 00:20:36.952 "get_zone_info": false, 00:20:36.952 "zone_management": false, 00:20:36.952 "zone_append": false, 00:20:36.952 "compare": false, 00:20:36.952 "compare_and_write": false, 00:20:36.952 "abort": true, 00:20:36.952 "seek_hole": false, 00:20:36.952 "seek_data": false, 00:20:36.952 "copy": true, 00:20:36.952 "nvme_iov_md": false 00:20:36.952 }, 00:20:36.952 "memory_domains": [ 00:20:36.952 { 00:20:36.952 "dma_device_id": "system", 00:20:36.952 "dma_device_type": 1 00:20:36.952 }, 00:20:36.952 { 00:20:36.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.952 "dma_device_type": 2 00:20:36.952 } 00:20:36.952 ], 00:20:36.952 "driver_specific": {} 00:20:36.952 } 00:20:36.952 ] 00:20:36.952 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:36.952 07:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:36.952 07:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:36.952 07:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:37.211 BaseBdev4 00:20:37.211 07:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:37.211 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:37.211 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:37.211 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:37.211 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:37.211 07:26:09 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:37.211 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:37.469 07:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:37.728 [ 00:20:37.728 { 00:20:37.728 "name": "BaseBdev4", 00:20:37.728 "aliases": [ 00:20:37.728 "c970d8c3-3f2e-486d-b0c5-544f069b0e4b" 00:20:37.728 ], 00:20:37.728 "product_name": "Malloc disk", 00:20:37.728 "block_size": 512, 00:20:37.729 "num_blocks": 65536, 00:20:37.729 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:37.729 "assigned_rate_limits": { 00:20:37.729 "rw_ios_per_sec": 0, 00:20:37.729 "rw_mbytes_per_sec": 0, 00:20:37.729 "r_mbytes_per_sec": 0, 00:20:37.729 "w_mbytes_per_sec": 0 00:20:37.729 }, 00:20:37.729 "claimed": false, 00:20:37.729 "zoned": false, 00:20:37.729 "supported_io_types": { 00:20:37.729 "read": true, 00:20:37.729 "write": true, 00:20:37.729 "unmap": true, 00:20:37.729 "flush": true, 00:20:37.729 "reset": true, 00:20:37.729 "nvme_admin": false, 00:20:37.729 "nvme_io": false, 00:20:37.729 "nvme_io_md": false, 00:20:37.729 "write_zeroes": true, 00:20:37.729 "zcopy": true, 00:20:37.729 "get_zone_info": false, 00:20:37.729 "zone_management": false, 00:20:37.729 "zone_append": false, 00:20:37.729 "compare": false, 00:20:37.729 "compare_and_write": false, 00:20:37.729 "abort": true, 00:20:37.729 "seek_hole": false, 00:20:37.729 "seek_data": false, 00:20:37.729 "copy": true, 00:20:37.729 "nvme_iov_md": false 00:20:37.729 }, 00:20:37.729 "memory_domains": [ 00:20:37.729 { 00:20:37.729 "dma_device_id": "system", 00:20:37.729 "dma_device_type": 1 00:20:37.729 }, 00:20:37.729 { 00:20:37.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:37.729 "dma_device_type": 2 00:20:37.729 } 00:20:37.729 ], 00:20:37.729 "driver_specific": {} 00:20:37.729 } 00:20:37.729 ] 00:20:37.729 07:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:37.729 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:37.729 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:37.729 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:37.987 [2024-07-25 07:26:10.321294] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:37.987 [2024-07-25 07:26:10.321334] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:37.987 [2024-07-25 07:26:10.321352] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:37.987 [2024-07-25 07:26:10.322587] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:37.987 [2024-07-25 07:26:10.322626] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:37.987 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:37.988 07:26:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.988 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:38.247 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.247 "name": "Existed_Raid", 00:20:38.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.247 "strip_size_kb": 64, 00:20:38.247 "state": "configuring", 00:20:38.247 "raid_level": "concat", 00:20:38.247 "superblock": false, 00:20:38.247 "num_base_bdevs": 4, 00:20:38.247 "num_base_bdevs_discovered": 3, 00:20:38.247 "num_base_bdevs_operational": 4, 00:20:38.247 "base_bdevs_list": [ 00:20:38.247 { 00:20:38.247 "name": "BaseBdev1", 00:20:38.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.247 "is_configured": false, 00:20:38.247 "data_offset": 0, 00:20:38.247 "data_size": 0 00:20:38.247 }, 00:20:38.247 { 00:20:38.247 "name": "BaseBdev2", 00:20:38.247 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:38.247 "is_configured": true, 00:20:38.247 "data_offset": 0, 00:20:38.247 "data_size": 65536 00:20:38.247 }, 00:20:38.247 { 00:20:38.247 "name": "BaseBdev3", 00:20:38.247 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:38.247 "is_configured": true, 00:20:38.247 "data_offset": 0, 00:20:38.247 "data_size": 65536 00:20:38.247 }, 00:20:38.247 { 00:20:38.247 "name": "BaseBdev4", 00:20:38.247 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:38.247 "is_configured": true, 00:20:38.247 "data_offset": 0, 00:20:38.247 "data_size": 65536 00:20:38.247 } 00:20:38.247 ] 00:20:38.247 }' 00:20:38.247 07:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.247 07:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.815 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:39.074 [2024-07-25 07:26:11.364007] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
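The xtrace entries around this point are bdev_raid.sh expanding its verify_raid_bdev_state helper one local declaration at a time, which obscures what is actually being verified. Condensed into a standalone sketch (not part of the captured run), the check fetches the raid bdev's JSON over the same RPC socket used throughout this log and compares a few of the fields visible in the dumps above; the rpc.py path, socket, subcommand, and jq filter below are taken verbatim from the trace, while the specific comparisons are an assumption about the part of the helper that runs after xtrace is disabled.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Pull the entry for the raid bdev under test out of bdev_raid_get_bdevs output.
  info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  # Assumed comparisons against the expected values passed to the helper above
  # (state=configuring, raid_level=concat, strip_size_kb=64, num_base_bdevs_operational=4).
  [[ $(jq -r '.state' <<<"$info") == configuring ]] || echo "unexpected state"
  [[ $(jq -r '.raid_level' <<<"$info") == concat ]] || echo "unexpected raid level"
  [[ $(jq -r '.strip_size_kb' <<<"$info") == 64 ]] || echo "unexpected strip size"
  [[ $(jq -r '.num_base_bdevs_operational' <<<"$info") == 4 ]] || echo "unexpected operational count"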
00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.074 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:39.334 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.334 "name": "Existed_Raid", 00:20:39.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.334 "strip_size_kb": 64, 00:20:39.334 "state": "configuring", 00:20:39.334 "raid_level": "concat", 00:20:39.334 "superblock": false, 00:20:39.334 "num_base_bdevs": 4, 00:20:39.334 "num_base_bdevs_discovered": 2, 00:20:39.334 "num_base_bdevs_operational": 4, 00:20:39.334 "base_bdevs_list": [ 00:20:39.334 { 00:20:39.334 "name": "BaseBdev1", 00:20:39.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.334 "is_configured": false, 00:20:39.334 "data_offset": 0, 00:20:39.334 "data_size": 0 00:20:39.334 }, 00:20:39.334 { 00:20:39.334 "name": null, 00:20:39.334 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:39.334 "is_configured": false, 00:20:39.334 "data_offset": 0, 00:20:39.334 "data_size": 65536 00:20:39.334 }, 00:20:39.334 { 00:20:39.334 "name": "BaseBdev3", 00:20:39.334 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:39.334 "is_configured": true, 00:20:39.334 "data_offset": 0, 00:20:39.334 "data_size": 65536 00:20:39.334 }, 00:20:39.334 { 00:20:39.334 "name": "BaseBdev4", 00:20:39.334 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:39.334 "is_configured": true, 00:20:39.334 "data_offset": 0, 00:20:39.334 "data_size": 65536 00:20:39.334 } 00:20:39.334 ] 00:20:39.334 }' 00:20:39.334 07:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.334 07:26:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:39.902 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.902 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:39.902 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:39.902 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:40.161 [2024-07-25 
07:26:12.534321] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:40.162 BaseBdev1 00:20:40.162 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:40.162 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:40.162 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:40.162 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:40.162 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:40.162 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:40.162 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:40.421 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:40.680 [ 00:20:40.680 { 00:20:40.680 "name": "BaseBdev1", 00:20:40.680 "aliases": [ 00:20:40.680 "45293167-ead9-4fcb-816a-7c36b3808489" 00:20:40.680 ], 00:20:40.680 "product_name": "Malloc disk", 00:20:40.680 "block_size": 512, 00:20:40.680 "num_blocks": 65536, 00:20:40.680 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:40.680 "assigned_rate_limits": { 00:20:40.680 "rw_ios_per_sec": 0, 00:20:40.680 "rw_mbytes_per_sec": 0, 00:20:40.680 "r_mbytes_per_sec": 0, 00:20:40.680 "w_mbytes_per_sec": 0 00:20:40.680 }, 00:20:40.680 "claimed": true, 00:20:40.680 "claim_type": "exclusive_write", 00:20:40.680 "zoned": false, 00:20:40.680 "supported_io_types": { 00:20:40.680 "read": true, 00:20:40.680 "write": true, 00:20:40.680 "unmap": true, 00:20:40.680 "flush": true, 00:20:40.680 "reset": true, 00:20:40.680 "nvme_admin": false, 00:20:40.680 "nvme_io": false, 00:20:40.680 "nvme_io_md": false, 00:20:40.680 "write_zeroes": true, 00:20:40.680 "zcopy": true, 00:20:40.680 "get_zone_info": false, 00:20:40.680 "zone_management": false, 00:20:40.680 "zone_append": false, 00:20:40.680 "compare": false, 00:20:40.680 "compare_and_write": false, 00:20:40.680 "abort": true, 00:20:40.680 "seek_hole": false, 00:20:40.680 "seek_data": false, 00:20:40.680 "copy": true, 00:20:40.680 "nvme_iov_md": false 00:20:40.680 }, 00:20:40.680 "memory_domains": [ 00:20:40.680 { 00:20:40.680 "dma_device_id": "system", 00:20:40.680 "dma_device_type": 1 00:20:40.680 }, 00:20:40.680 { 00:20:40.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.680 "dma_device_type": 2 00:20:40.680 } 00:20:40.680 ], 00:20:40.680 "driver_specific": {} 00:20:40.680 } 00:20:40.680 ] 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.680 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.681 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.681 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.681 07:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.681 07:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.681 07:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:40.940 07:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.940 "name": "Existed_Raid", 00:20:40.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.940 "strip_size_kb": 64, 00:20:40.940 "state": "configuring", 00:20:40.940 "raid_level": "concat", 00:20:40.940 "superblock": false, 00:20:40.940 "num_base_bdevs": 4, 00:20:40.940 "num_base_bdevs_discovered": 3, 00:20:40.940 "num_base_bdevs_operational": 4, 00:20:40.940 "base_bdevs_list": [ 00:20:40.940 { 00:20:40.940 "name": "BaseBdev1", 00:20:40.940 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:40.940 "is_configured": true, 00:20:40.940 "data_offset": 0, 00:20:40.940 "data_size": 65536 00:20:40.940 }, 00:20:40.940 { 00:20:40.940 "name": null, 00:20:40.940 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:40.940 "is_configured": false, 00:20:40.940 "data_offset": 0, 00:20:40.940 "data_size": 65536 00:20:40.940 }, 00:20:40.940 { 00:20:40.940 "name": "BaseBdev3", 00:20:40.940 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:40.940 "is_configured": true, 00:20:40.940 "data_offset": 0, 00:20:40.940 "data_size": 65536 00:20:40.940 }, 00:20:40.940 { 00:20:40.940 "name": "BaseBdev4", 00:20:40.940 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:40.940 "is_configured": true, 00:20:40.940 "data_offset": 0, 00:20:40.940 "data_size": 65536 00:20:40.940 } 00:20:40.940 ] 00:20:40.940 }' 00:20:40.940 07:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.940 07:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:41.507 07:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.507 07:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:41.507 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:41.508 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:41.766 [2024-07-25 07:26:14.218787] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.766 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:42.025 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.025 "name": "Existed_Raid", 00:20:42.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.025 "strip_size_kb": 64, 00:20:42.025 "state": "configuring", 00:20:42.025 "raid_level": "concat", 00:20:42.025 "superblock": false, 00:20:42.025 "num_base_bdevs": 4, 00:20:42.025 "num_base_bdevs_discovered": 2, 00:20:42.025 "num_base_bdevs_operational": 4, 00:20:42.025 "base_bdevs_list": [ 00:20:42.025 { 00:20:42.025 "name": "BaseBdev1", 00:20:42.025 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:42.025 "is_configured": true, 00:20:42.025 "data_offset": 0, 00:20:42.025 "data_size": 65536 00:20:42.025 }, 00:20:42.025 { 00:20:42.025 "name": null, 00:20:42.025 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:42.025 "is_configured": false, 00:20:42.025 "data_offset": 0, 00:20:42.025 "data_size": 65536 00:20:42.025 }, 00:20:42.025 { 00:20:42.025 "name": null, 00:20:42.025 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:42.025 "is_configured": false, 00:20:42.025 "data_offset": 0, 00:20:42.025 "data_size": 65536 00:20:42.025 }, 00:20:42.025 { 00:20:42.025 "name": "BaseBdev4", 00:20:42.025 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:42.025 "is_configured": true, 00:20:42.025 "data_offset": 0, 00:20:42.025 "data_size": 65536 00:20:42.025 } 00:20:42.025 ] 00:20:42.025 }' 00:20:42.025 07:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.025 07:26:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.594 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.594 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:42.853 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:42.853 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:43.112 [2024-07-25 07:26:15.462214] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:43.112 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:43.112 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:43.112 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:43.112 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:43.112 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:43.113 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:43.113 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:43.113 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:43.113 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:43.113 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:43.113 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.113 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:43.371 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:43.371 "name": "Existed_Raid", 00:20:43.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.371 "strip_size_kb": 64, 00:20:43.371 "state": "configuring", 00:20:43.371 "raid_level": "concat", 00:20:43.371 "superblock": false, 00:20:43.371 "num_base_bdevs": 4, 00:20:43.371 "num_base_bdevs_discovered": 3, 00:20:43.371 "num_base_bdevs_operational": 4, 00:20:43.371 "base_bdevs_list": [ 00:20:43.371 { 00:20:43.371 "name": "BaseBdev1", 00:20:43.371 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:43.371 "is_configured": true, 00:20:43.371 "data_offset": 0, 00:20:43.371 "data_size": 65536 00:20:43.371 }, 00:20:43.371 { 00:20:43.371 "name": null, 00:20:43.371 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:43.371 "is_configured": false, 00:20:43.371 "data_offset": 0, 00:20:43.371 "data_size": 65536 00:20:43.371 }, 00:20:43.371 { 00:20:43.371 "name": "BaseBdev3", 00:20:43.371 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:43.371 "is_configured": true, 00:20:43.371 "data_offset": 0, 00:20:43.371 "data_size": 65536 00:20:43.371 }, 00:20:43.371 { 00:20:43.371 "name": "BaseBdev4", 00:20:43.371 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:43.371 "is_configured": true, 00:20:43.371 "data_offset": 0, 00:20:43.371 "data_size": 65536 00:20:43.371 } 00:20:43.371 ] 00:20:43.371 }' 00:20:43.371 07:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:43.371 07:26:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.939 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:43.939 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.198 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:44.199 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:44.199 [2024-07-25 07:26:16.717512] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.458 "name": "Existed_Raid", 00:20:44.458 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.458 "strip_size_kb": 64, 00:20:44.458 "state": "configuring", 00:20:44.458 "raid_level": "concat", 00:20:44.458 "superblock": false, 00:20:44.458 "num_base_bdevs": 4, 00:20:44.458 "num_base_bdevs_discovered": 2, 00:20:44.458 "num_base_bdevs_operational": 4, 00:20:44.458 "base_bdevs_list": [ 00:20:44.458 { 00:20:44.458 "name": null, 00:20:44.458 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:44.458 "is_configured": false, 00:20:44.458 "data_offset": 0, 00:20:44.458 "data_size": 65536 00:20:44.458 }, 00:20:44.458 { 00:20:44.458 "name": null, 00:20:44.458 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:44.458 "is_configured": false, 00:20:44.458 "data_offset": 0, 00:20:44.458 "data_size": 65536 00:20:44.458 }, 00:20:44.458 { 00:20:44.458 "name": "BaseBdev3", 00:20:44.458 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:44.458 "is_configured": true, 00:20:44.458 "data_offset": 0, 00:20:44.458 "data_size": 65536 00:20:44.458 }, 00:20:44.458 { 00:20:44.458 "name": "BaseBdev4", 00:20:44.458 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:44.458 "is_configured": true, 00:20:44.458 "data_offset": 0, 00:20:44.458 "data_size": 65536 00:20:44.458 } 00:20:44.458 ] 00:20:44.458 }' 00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
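The entries just before and after this point repeat one pattern against the raid that is still assembling: drop a base bdev, confirm its slot in base_bdevs_list flips to is_configured=false while the raid stays in the configuring state, then give a bdev back with bdev_raid_add_base_bdev and confirm the slot flips to true. A condensed sketch of the cycle traced right around here, using only commands and jq paths that appear verbatim in this log (the grouping into one uninterrupted sequence is editorial, and the same RPC socket is assumed):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Delete BaseBdev1 out from under the raid; its slot should report is_configured=false.
  "$rpc" -s "$sock" bdev_malloc_delete BaseBdev1
  "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[0].is_configured'   # expected: false
  # Hand the still-existing BaseBdev2 back to the raid; its slot should flip back to true.
  "$rpc" -s "$sock" bdev_raid_add_base_bdev Existed_Raid BaseBdev2
  "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # expected: true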
00:20:44.458 07:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.027 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:45.027 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.286 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:45.286 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:45.545 [2024-07-25 07:26:17.970968] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.545 07:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.803 07:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.803 "name": "Existed_Raid", 00:20:45.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.803 "strip_size_kb": 64, 00:20:45.803 "state": "configuring", 00:20:45.803 "raid_level": "concat", 00:20:45.803 "superblock": false, 00:20:45.803 "num_base_bdevs": 4, 00:20:45.803 "num_base_bdevs_discovered": 3, 00:20:45.803 "num_base_bdevs_operational": 4, 00:20:45.803 "base_bdevs_list": [ 00:20:45.803 { 00:20:45.803 "name": null, 00:20:45.803 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:45.803 "is_configured": false, 00:20:45.803 "data_offset": 0, 00:20:45.803 "data_size": 65536 00:20:45.803 }, 00:20:45.803 { 00:20:45.803 "name": "BaseBdev2", 00:20:45.803 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:45.803 "is_configured": true, 00:20:45.803 "data_offset": 0, 00:20:45.803 "data_size": 65536 00:20:45.803 }, 00:20:45.803 { 00:20:45.803 "name": "BaseBdev3", 00:20:45.803 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:45.803 "is_configured": true, 00:20:45.803 "data_offset": 0, 00:20:45.803 "data_size": 65536 
00:20:45.803 }, 00:20:45.803 { 00:20:45.803 "name": "BaseBdev4", 00:20:45.803 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:45.803 "is_configured": true, 00:20:45.803 "data_offset": 0, 00:20:45.803 "data_size": 65536 00:20:45.803 } 00:20:45.803 ] 00:20:45.803 }' 00:20:45.803 07:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.803 07:26:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:46.371 07:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.371 07:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:46.665 07:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:46.665 07:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.665 07:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:46.665 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 45293167-ead9-4fcb-816a-7c36b3808489 00:20:46.924 [2024-07-25 07:26:19.397789] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:46.924 [2024-07-25 07:26:19.397825] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x952a30 00:20:46.924 [2024-07-25 07:26:19.397833] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:46.924 [2024-07-25 07:26:19.398011] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x952ef0 00:20:46.924 [2024-07-25 07:26:19.398117] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x952a30 00:20:46.924 [2024-07-25 07:26:19.398126] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x952a30 00:20:46.924 [2024-07-25 07:26:19.398287] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:46.924 NewBaseBdev 00:20:46.924 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:46.924 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:20:46.924 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:46.924 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:46.924 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:46.924 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:46.924 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:47.183 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:47.442 [ 00:20:47.442 { 00:20:47.442 
"name": "NewBaseBdev", 00:20:47.442 "aliases": [ 00:20:47.442 "45293167-ead9-4fcb-816a-7c36b3808489" 00:20:47.442 ], 00:20:47.442 "product_name": "Malloc disk", 00:20:47.442 "block_size": 512, 00:20:47.442 "num_blocks": 65536, 00:20:47.442 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:47.442 "assigned_rate_limits": { 00:20:47.442 "rw_ios_per_sec": 0, 00:20:47.442 "rw_mbytes_per_sec": 0, 00:20:47.442 "r_mbytes_per_sec": 0, 00:20:47.442 "w_mbytes_per_sec": 0 00:20:47.442 }, 00:20:47.442 "claimed": true, 00:20:47.442 "claim_type": "exclusive_write", 00:20:47.442 "zoned": false, 00:20:47.442 "supported_io_types": { 00:20:47.442 "read": true, 00:20:47.442 "write": true, 00:20:47.442 "unmap": true, 00:20:47.442 "flush": true, 00:20:47.442 "reset": true, 00:20:47.442 "nvme_admin": false, 00:20:47.442 "nvme_io": false, 00:20:47.442 "nvme_io_md": false, 00:20:47.442 "write_zeroes": true, 00:20:47.442 "zcopy": true, 00:20:47.442 "get_zone_info": false, 00:20:47.442 "zone_management": false, 00:20:47.442 "zone_append": false, 00:20:47.442 "compare": false, 00:20:47.442 "compare_and_write": false, 00:20:47.442 "abort": true, 00:20:47.442 "seek_hole": false, 00:20:47.442 "seek_data": false, 00:20:47.442 "copy": true, 00:20:47.442 "nvme_iov_md": false 00:20:47.442 }, 00:20:47.442 "memory_domains": [ 00:20:47.442 { 00:20:47.442 "dma_device_id": "system", 00:20:47.442 "dma_device_type": 1 00:20:47.442 }, 00:20:47.442 { 00:20:47.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.442 "dma_device_type": 2 00:20:47.442 } 00:20:47.442 ], 00:20:47.442 "driver_specific": {} 00:20:47.442 } 00:20:47.442 ] 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.442 07:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.701 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.701 "name": "Existed_Raid", 00:20:47.701 "uuid": "e837aeb4-88ed-4d82-8ea4-db6b561ff166", 00:20:47.701 "strip_size_kb": 64, 00:20:47.701 "state": "online", 00:20:47.701 "raid_level": "concat", 00:20:47.701 "superblock": false, 
00:20:47.701 "num_base_bdevs": 4, 00:20:47.701 "num_base_bdevs_discovered": 4, 00:20:47.701 "num_base_bdevs_operational": 4, 00:20:47.701 "base_bdevs_list": [ 00:20:47.701 { 00:20:47.701 "name": "NewBaseBdev", 00:20:47.701 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:47.701 "is_configured": true, 00:20:47.701 "data_offset": 0, 00:20:47.701 "data_size": 65536 00:20:47.701 }, 00:20:47.701 { 00:20:47.701 "name": "BaseBdev2", 00:20:47.701 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:47.701 "is_configured": true, 00:20:47.701 "data_offset": 0, 00:20:47.701 "data_size": 65536 00:20:47.701 }, 00:20:47.701 { 00:20:47.701 "name": "BaseBdev3", 00:20:47.701 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:47.701 "is_configured": true, 00:20:47.701 "data_offset": 0, 00:20:47.701 "data_size": 65536 00:20:47.701 }, 00:20:47.701 { 00:20:47.701 "name": "BaseBdev4", 00:20:47.701 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:47.701 "is_configured": true, 00:20:47.701 "data_offset": 0, 00:20:47.701 "data_size": 65536 00:20:47.701 } 00:20:47.701 ] 00:20:47.701 }' 00:20:47.701 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.701 07:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:48.268 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:48.527 [2024-07-25 07:26:20.825860] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:48.527 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:48.527 "name": "Existed_Raid", 00:20:48.527 "aliases": [ 00:20:48.527 "e837aeb4-88ed-4d82-8ea4-db6b561ff166" 00:20:48.527 ], 00:20:48.527 "product_name": "Raid Volume", 00:20:48.527 "block_size": 512, 00:20:48.527 "num_blocks": 262144, 00:20:48.527 "uuid": "e837aeb4-88ed-4d82-8ea4-db6b561ff166", 00:20:48.527 "assigned_rate_limits": { 00:20:48.527 "rw_ios_per_sec": 0, 00:20:48.527 "rw_mbytes_per_sec": 0, 00:20:48.527 "r_mbytes_per_sec": 0, 00:20:48.527 "w_mbytes_per_sec": 0 00:20:48.527 }, 00:20:48.527 "claimed": false, 00:20:48.527 "zoned": false, 00:20:48.527 "supported_io_types": { 00:20:48.527 "read": true, 00:20:48.527 "write": true, 00:20:48.527 "unmap": true, 00:20:48.527 "flush": true, 00:20:48.527 "reset": true, 00:20:48.527 "nvme_admin": false, 00:20:48.527 "nvme_io": false, 00:20:48.527 "nvme_io_md": false, 00:20:48.527 "write_zeroes": true, 00:20:48.527 "zcopy": false, 00:20:48.527 "get_zone_info": false, 00:20:48.527 "zone_management": false, 00:20:48.527 "zone_append": false, 00:20:48.527 "compare": false, 00:20:48.527 
"compare_and_write": false, 00:20:48.527 "abort": false, 00:20:48.527 "seek_hole": false, 00:20:48.527 "seek_data": false, 00:20:48.527 "copy": false, 00:20:48.527 "nvme_iov_md": false 00:20:48.527 }, 00:20:48.527 "memory_domains": [ 00:20:48.527 { 00:20:48.527 "dma_device_id": "system", 00:20:48.527 "dma_device_type": 1 00:20:48.527 }, 00:20:48.527 { 00:20:48.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.527 "dma_device_type": 2 00:20:48.527 }, 00:20:48.527 { 00:20:48.527 "dma_device_id": "system", 00:20:48.527 "dma_device_type": 1 00:20:48.527 }, 00:20:48.527 { 00:20:48.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.527 "dma_device_type": 2 00:20:48.527 }, 00:20:48.527 { 00:20:48.527 "dma_device_id": "system", 00:20:48.527 "dma_device_type": 1 00:20:48.527 }, 00:20:48.527 { 00:20:48.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.527 "dma_device_type": 2 00:20:48.527 }, 00:20:48.527 { 00:20:48.527 "dma_device_id": "system", 00:20:48.527 "dma_device_type": 1 00:20:48.527 }, 00:20:48.527 { 00:20:48.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.527 "dma_device_type": 2 00:20:48.527 } 00:20:48.527 ], 00:20:48.527 "driver_specific": { 00:20:48.527 "raid": { 00:20:48.527 "uuid": "e837aeb4-88ed-4d82-8ea4-db6b561ff166", 00:20:48.527 "strip_size_kb": 64, 00:20:48.527 "state": "online", 00:20:48.527 "raid_level": "concat", 00:20:48.527 "superblock": false, 00:20:48.527 "num_base_bdevs": 4, 00:20:48.527 "num_base_bdevs_discovered": 4, 00:20:48.527 "num_base_bdevs_operational": 4, 00:20:48.527 "base_bdevs_list": [ 00:20:48.527 { 00:20:48.527 "name": "NewBaseBdev", 00:20:48.527 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:48.528 "is_configured": true, 00:20:48.528 "data_offset": 0, 00:20:48.528 "data_size": 65536 00:20:48.528 }, 00:20:48.528 { 00:20:48.528 "name": "BaseBdev2", 00:20:48.528 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:48.528 "is_configured": true, 00:20:48.528 "data_offset": 0, 00:20:48.528 "data_size": 65536 00:20:48.528 }, 00:20:48.528 { 00:20:48.528 "name": "BaseBdev3", 00:20:48.528 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:48.528 "is_configured": true, 00:20:48.528 "data_offset": 0, 00:20:48.528 "data_size": 65536 00:20:48.528 }, 00:20:48.528 { 00:20:48.528 "name": "BaseBdev4", 00:20:48.528 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:48.528 "is_configured": true, 00:20:48.528 "data_offset": 0, 00:20:48.528 "data_size": 65536 00:20:48.528 } 00:20:48.528 ] 00:20:48.528 } 00:20:48.528 } 00:20:48.528 }' 00:20:48.528 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:48.528 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:48.528 BaseBdev2 00:20:48.528 BaseBdev3 00:20:48.528 BaseBdev4' 00:20:48.528 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:48.528 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:48.528 07:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:48.787 "name": "NewBaseBdev", 00:20:48.787 "aliases": [ 00:20:48.787 "45293167-ead9-4fcb-816a-7c36b3808489" 00:20:48.787 ], 00:20:48.787 
"product_name": "Malloc disk", 00:20:48.787 "block_size": 512, 00:20:48.787 "num_blocks": 65536, 00:20:48.787 "uuid": "45293167-ead9-4fcb-816a-7c36b3808489", 00:20:48.787 "assigned_rate_limits": { 00:20:48.787 "rw_ios_per_sec": 0, 00:20:48.787 "rw_mbytes_per_sec": 0, 00:20:48.787 "r_mbytes_per_sec": 0, 00:20:48.787 "w_mbytes_per_sec": 0 00:20:48.787 }, 00:20:48.787 "claimed": true, 00:20:48.787 "claim_type": "exclusive_write", 00:20:48.787 "zoned": false, 00:20:48.787 "supported_io_types": { 00:20:48.787 "read": true, 00:20:48.787 "write": true, 00:20:48.787 "unmap": true, 00:20:48.787 "flush": true, 00:20:48.787 "reset": true, 00:20:48.787 "nvme_admin": false, 00:20:48.787 "nvme_io": false, 00:20:48.787 "nvme_io_md": false, 00:20:48.787 "write_zeroes": true, 00:20:48.787 "zcopy": true, 00:20:48.787 "get_zone_info": false, 00:20:48.787 "zone_management": false, 00:20:48.787 "zone_append": false, 00:20:48.787 "compare": false, 00:20:48.787 "compare_and_write": false, 00:20:48.787 "abort": true, 00:20:48.787 "seek_hole": false, 00:20:48.787 "seek_data": false, 00:20:48.787 "copy": true, 00:20:48.787 "nvme_iov_md": false 00:20:48.787 }, 00:20:48.787 "memory_domains": [ 00:20:48.787 { 00:20:48.787 "dma_device_id": "system", 00:20:48.787 "dma_device_type": 1 00:20:48.787 }, 00:20:48.787 { 00:20:48.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.787 "dma_device_type": 2 00:20:48.787 } 00:20:48.787 ], 00:20:48.787 "driver_specific": {} 00:20:48.787 }' 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:48.787 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:49.046 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:49.305 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:49.305 "name": "BaseBdev2", 00:20:49.305 "aliases": [ 00:20:49.305 "af959a6f-6bd1-47ca-8918-24d9ed30336c" 00:20:49.305 ], 00:20:49.305 "product_name": "Malloc disk", 00:20:49.305 "block_size": 512, 00:20:49.305 "num_blocks": 65536, 00:20:49.305 "uuid": "af959a6f-6bd1-47ca-8918-24d9ed30336c", 00:20:49.305 
"assigned_rate_limits": { 00:20:49.305 "rw_ios_per_sec": 0, 00:20:49.305 "rw_mbytes_per_sec": 0, 00:20:49.305 "r_mbytes_per_sec": 0, 00:20:49.305 "w_mbytes_per_sec": 0 00:20:49.305 }, 00:20:49.305 "claimed": true, 00:20:49.305 "claim_type": "exclusive_write", 00:20:49.305 "zoned": false, 00:20:49.305 "supported_io_types": { 00:20:49.305 "read": true, 00:20:49.305 "write": true, 00:20:49.305 "unmap": true, 00:20:49.305 "flush": true, 00:20:49.305 "reset": true, 00:20:49.305 "nvme_admin": false, 00:20:49.305 "nvme_io": false, 00:20:49.305 "nvme_io_md": false, 00:20:49.305 "write_zeroes": true, 00:20:49.305 "zcopy": true, 00:20:49.305 "get_zone_info": false, 00:20:49.305 "zone_management": false, 00:20:49.305 "zone_append": false, 00:20:49.305 "compare": false, 00:20:49.305 "compare_and_write": false, 00:20:49.305 "abort": true, 00:20:49.305 "seek_hole": false, 00:20:49.305 "seek_data": false, 00:20:49.305 "copy": true, 00:20:49.305 "nvme_iov_md": false 00:20:49.305 }, 00:20:49.305 "memory_domains": [ 00:20:49.305 { 00:20:49.305 "dma_device_id": "system", 00:20:49.305 "dma_device_type": 1 00:20:49.305 }, 00:20:49.305 { 00:20:49.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.305 "dma_device_type": 2 00:20:49.305 } 00:20:49.305 ], 00:20:49.305 "driver_specific": {} 00:20:49.305 }' 00:20:49.305 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.305 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.305 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:49.305 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.305 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:49.563 07:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:49.822 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:49.822 "name": "BaseBdev3", 00:20:49.822 "aliases": [ 00:20:49.822 "6a2a7dfc-9e01-4efa-bf87-8883e84518e7" 00:20:49.822 ], 00:20:49.822 "product_name": "Malloc disk", 00:20:49.822 "block_size": 512, 00:20:49.822 "num_blocks": 65536, 00:20:49.822 "uuid": "6a2a7dfc-9e01-4efa-bf87-8883e84518e7", 00:20:49.822 "assigned_rate_limits": { 00:20:49.822 "rw_ios_per_sec": 0, 00:20:49.822 "rw_mbytes_per_sec": 0, 00:20:49.822 "r_mbytes_per_sec": 0, 00:20:49.822 "w_mbytes_per_sec": 0 00:20:49.822 
}, 00:20:49.822 "claimed": true, 00:20:49.823 "claim_type": "exclusive_write", 00:20:49.823 "zoned": false, 00:20:49.823 "supported_io_types": { 00:20:49.823 "read": true, 00:20:49.823 "write": true, 00:20:49.823 "unmap": true, 00:20:49.823 "flush": true, 00:20:49.823 "reset": true, 00:20:49.823 "nvme_admin": false, 00:20:49.823 "nvme_io": false, 00:20:49.823 "nvme_io_md": false, 00:20:49.823 "write_zeroes": true, 00:20:49.823 "zcopy": true, 00:20:49.823 "get_zone_info": false, 00:20:49.823 "zone_management": false, 00:20:49.823 "zone_append": false, 00:20:49.823 "compare": false, 00:20:49.823 "compare_and_write": false, 00:20:49.823 "abort": true, 00:20:49.823 "seek_hole": false, 00:20:49.823 "seek_data": false, 00:20:49.823 "copy": true, 00:20:49.823 "nvme_iov_md": false 00:20:49.823 }, 00:20:49.823 "memory_domains": [ 00:20:49.823 { 00:20:49.823 "dma_device_id": "system", 00:20:49.823 "dma_device_type": 1 00:20:49.823 }, 00:20:49.823 { 00:20:49.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.823 "dma_device_type": 2 00:20:49.823 } 00:20:49.823 ], 00:20:49.823 "driver_specific": {} 00:20:49.823 }' 00:20:49.823 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.823 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.823 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:49.823 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.823 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.823 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:49.823 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:50.082 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:50.341 "name": "BaseBdev4", 00:20:50.341 "aliases": [ 00:20:50.341 "c970d8c3-3f2e-486d-b0c5-544f069b0e4b" 00:20:50.341 ], 00:20:50.341 "product_name": "Malloc disk", 00:20:50.341 "block_size": 512, 00:20:50.341 "num_blocks": 65536, 00:20:50.341 "uuid": "c970d8c3-3f2e-486d-b0c5-544f069b0e4b", 00:20:50.341 "assigned_rate_limits": { 00:20:50.341 "rw_ios_per_sec": 0, 00:20:50.341 "rw_mbytes_per_sec": 0, 00:20:50.341 "r_mbytes_per_sec": 0, 00:20:50.341 "w_mbytes_per_sec": 0 00:20:50.341 }, 00:20:50.341 "claimed": true, 00:20:50.341 "claim_type": "exclusive_write", 00:20:50.341 "zoned": false, 00:20:50.341 "supported_io_types": { 00:20:50.341 "read": true, 
00:20:50.341 "write": true, 00:20:50.341 "unmap": true, 00:20:50.341 "flush": true, 00:20:50.341 "reset": true, 00:20:50.341 "nvme_admin": false, 00:20:50.341 "nvme_io": false, 00:20:50.341 "nvme_io_md": false, 00:20:50.341 "write_zeroes": true, 00:20:50.341 "zcopy": true, 00:20:50.341 "get_zone_info": false, 00:20:50.341 "zone_management": false, 00:20:50.341 "zone_append": false, 00:20:50.341 "compare": false, 00:20:50.341 "compare_and_write": false, 00:20:50.341 "abort": true, 00:20:50.341 "seek_hole": false, 00:20:50.341 "seek_data": false, 00:20:50.341 "copy": true, 00:20:50.341 "nvme_iov_md": false 00:20:50.341 }, 00:20:50.341 "memory_domains": [ 00:20:50.341 { 00:20:50.341 "dma_device_id": "system", 00:20:50.341 "dma_device_type": 1 00:20:50.341 }, 00:20:50.341 { 00:20:50.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:50.341 "dma_device_type": 2 00:20:50.341 } 00:20:50.341 ], 00:20:50.341 "driver_specific": {} 00:20:50.341 }' 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.341 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.600 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:50.600 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.600 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.600 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:50.600 07:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:50.859 [2024-07-25 07:26:23.171750] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:50.859 [2024-07-25 07:26:23.171775] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:50.859 [2024-07-25 07:26:23.171827] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:50.859 [2024-07-25 07:26:23.171882] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:50.859 [2024-07-25 07:26:23.171894] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x952a30 name Existed_Raid, state offline 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1677988 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1677988 ']' 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1677988 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1677988 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1677988' 00:20:50.859 killing process with pid 1677988 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1677988 00:20:50.859 [2024-07-25 07:26:23.226904] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:50.859 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1677988 00:20:50.859 [2024-07-25 07:26:23.259295] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:51.119 00:20:51.119 real 0m30.186s 00:20:51.119 user 0m55.315s 00:20:51.119 sys 0m5.459s 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:51.119 ************************************ 00:20:51.119 END TEST raid_state_function_test 00:20:51.119 ************************************ 00:20:51.119 07:26:23 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:20:51.119 07:26:23 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:51.119 07:26:23 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:51.119 07:26:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:51.119 ************************************ 00:20:51.119 START TEST raid_state_function_test_sb 00:20:51.119 ************************************ 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 true 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1683700 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1683700' 00:20:51.119 Process raid pid: 1683700 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1683700 /var/tmp/spdk-raid.sock 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1683700 ']' 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:51.119 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
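The trace that follows drives the freshly started bdev_svc application entirely over its RPC socket: it asks for a 4-disk concat RAID with superblocks before any base bdev exists, then registers the malloc base bdevs one by one and re-reads the RAID state after each step. A minimal by-hand sketch of that same flow is shown below. It is a sketch, not the test script itself: $SPDK stands in for the full /var/jenkins/workspace/crypto-phy-autotest/spdk path, and the socket wait loop is a simplified stand-in for the test's waitforlisten helper; the socket path, RPC commands, sizes and bdev names are the ones visible in the log.

# Sketch of the flow the raid_state_function_test_sb trace exercises (assumptions noted above)
$SPDK/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
# stand-in for the test's waitforlisten: wait until the RPC socket exists
while [ ! -S /var/tmp/spdk-raid.sock ]; do sleep 0.1; done
# create the concat RAID (-s adds superblocks) before its base bdevs exist;
# it stays in the "configuring" state until all four are registered
$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
    # 32 MiB malloc disks with a 512-byte block size, as used by the test
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b "$b"
done
# inspect the assembled volume, then tear it down again
$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid")'
$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid

Creating the RAID before its members is deliberate: the state dumps below show Existed_Raid sitting in "configuring" while num_base_bdevs_discovered climbs from 0 to 4, and only then flipping to "online".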
00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:51.119 07:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:51.119 [2024-07-25 07:26:23.593058] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:20:51.119 [2024-07-25 07:26:23.593112] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:01.0 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 
0000:3f:01.4 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.378 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:51.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.379 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:51.379 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:51.379 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:51.379 [2024-07-25 07:26:23.714995] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:51.379 [2024-07-25 07:26:23.801095] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:20:51.379 [2024-07-25 07:26:23.856056] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:51.379 [2024-07-25 07:26:23.856088] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:52.312 [2024-07-25 07:26:24.709637] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:52.312 [2024-07-25 07:26:24.709673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:52.312 [2024-07-25 07:26:24.709683] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:52.312 [2024-07-25 07:26:24.709694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:52.312 [2024-07-25 07:26:24.709702] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:52.312 [2024-07-25 07:26:24.709712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:52.312 [2024-07-25 07:26:24.709724] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:52.312 [2024-07-25 07:26:24.709734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't 
exist now 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.312 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:52.571 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.571 "name": "Existed_Raid", 00:20:52.571 "uuid": "a11b9f5c-22bc-4c4a-b961-45639de4c9e9", 00:20:52.571 "strip_size_kb": 64, 00:20:52.571 "state": "configuring", 00:20:52.571 "raid_level": "concat", 00:20:52.571 "superblock": true, 00:20:52.571 "num_base_bdevs": 4, 00:20:52.571 "num_base_bdevs_discovered": 0, 00:20:52.571 "num_base_bdevs_operational": 4, 00:20:52.571 "base_bdevs_list": [ 00:20:52.571 { 00:20:52.571 "name": "BaseBdev1", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.571 "is_configured": false, 00:20:52.571 "data_offset": 0, 00:20:52.571 "data_size": 0 00:20:52.571 }, 00:20:52.571 { 00:20:52.571 "name": "BaseBdev2", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.571 "is_configured": false, 00:20:52.571 "data_offset": 0, 00:20:52.571 "data_size": 0 00:20:52.571 }, 00:20:52.571 { 00:20:52.571 "name": "BaseBdev3", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.571 "is_configured": false, 00:20:52.571 "data_offset": 0, 00:20:52.571 "data_size": 0 00:20:52.571 }, 00:20:52.571 { 00:20:52.571 "name": "BaseBdev4", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.571 "is_configured": false, 00:20:52.571 "data_offset": 0, 00:20:52.571 "data_size": 0 00:20:52.571 } 00:20:52.571 ] 00:20:52.571 }' 00:20:52.571 07:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.571 07:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.139 07:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:53.139 [2024-07-25 07:26:25.664030] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:53.139 [2024-07-25 07:26:25.664057] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a8ee0 name Existed_Raid, state configuring 00:20:53.409 07:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:53.409 [2024-07-25 07:26:25.892666] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:53.409 [2024-07-25 07:26:25.892693] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:53.409 [2024-07-25 07:26:25.892702] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:53.409 [2024-07-25 07:26:25.892713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:53.409 [2024-07-25 07:26:25.892721] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:53.410 [2024-07-25 07:26:25.892735] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:53.410 [2024-07-25 07:26:25.892743] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:53.410 [2024-07-25 07:26:25.892753] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:53.410 07:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:53.668 [2024-07-25 07:26:26.134710] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:53.668 BaseBdev1 00:20:53.668 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:53.668 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:53.668 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:53.668 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:53.668 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:53.668 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:53.668 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:53.927 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:54.186 [ 00:20:54.186 { 00:20:54.186 "name": "BaseBdev1", 00:20:54.186 "aliases": [ 00:20:54.186 "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78" 00:20:54.186 ], 00:20:54.186 "product_name": "Malloc disk", 00:20:54.186 "block_size": 512, 00:20:54.186 "num_blocks": 65536, 00:20:54.186 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:20:54.186 "assigned_rate_limits": { 00:20:54.186 "rw_ios_per_sec": 0, 00:20:54.186 "rw_mbytes_per_sec": 0, 00:20:54.186 "r_mbytes_per_sec": 0, 00:20:54.186 "w_mbytes_per_sec": 0 00:20:54.186 }, 00:20:54.186 "claimed": true, 00:20:54.186 "claim_type": "exclusive_write", 00:20:54.186 "zoned": false, 00:20:54.186 
"supported_io_types": { 00:20:54.186 "read": true, 00:20:54.186 "write": true, 00:20:54.186 "unmap": true, 00:20:54.186 "flush": true, 00:20:54.186 "reset": true, 00:20:54.186 "nvme_admin": false, 00:20:54.186 "nvme_io": false, 00:20:54.186 "nvme_io_md": false, 00:20:54.186 "write_zeroes": true, 00:20:54.186 "zcopy": true, 00:20:54.186 "get_zone_info": false, 00:20:54.186 "zone_management": false, 00:20:54.186 "zone_append": false, 00:20:54.186 "compare": false, 00:20:54.186 "compare_and_write": false, 00:20:54.186 "abort": true, 00:20:54.186 "seek_hole": false, 00:20:54.186 "seek_data": false, 00:20:54.186 "copy": true, 00:20:54.186 "nvme_iov_md": false 00:20:54.186 }, 00:20:54.186 "memory_domains": [ 00:20:54.186 { 00:20:54.186 "dma_device_id": "system", 00:20:54.186 "dma_device_type": 1 00:20:54.186 }, 00:20:54.186 { 00:20:54.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.186 "dma_device_type": 2 00:20:54.186 } 00:20:54.186 ], 00:20:54.186 "driver_specific": {} 00:20:54.186 } 00:20:54.186 ] 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.186 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.445 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.445 "name": "Existed_Raid", 00:20:54.445 "uuid": "1f766be1-f879-4179-ace9-1fe7bdc950a2", 00:20:54.445 "strip_size_kb": 64, 00:20:54.445 "state": "configuring", 00:20:54.445 "raid_level": "concat", 00:20:54.445 "superblock": true, 00:20:54.445 "num_base_bdevs": 4, 00:20:54.445 "num_base_bdevs_discovered": 1, 00:20:54.445 "num_base_bdevs_operational": 4, 00:20:54.445 "base_bdevs_list": [ 00:20:54.445 { 00:20:54.445 "name": "BaseBdev1", 00:20:54.445 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:20:54.445 "is_configured": true, 00:20:54.445 "data_offset": 2048, 00:20:54.445 "data_size": 63488 00:20:54.445 }, 00:20:54.445 { 00:20:54.445 "name": "BaseBdev2", 00:20:54.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.445 "is_configured": false, 00:20:54.445 
"data_offset": 0, 00:20:54.445 "data_size": 0 00:20:54.445 }, 00:20:54.445 { 00:20:54.445 "name": "BaseBdev3", 00:20:54.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.445 "is_configured": false, 00:20:54.445 "data_offset": 0, 00:20:54.445 "data_size": 0 00:20:54.445 }, 00:20:54.445 { 00:20:54.445 "name": "BaseBdev4", 00:20:54.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.445 "is_configured": false, 00:20:54.445 "data_offset": 0, 00:20:54.445 "data_size": 0 00:20:54.445 } 00:20:54.445 ] 00:20:54.445 }' 00:20:54.445 07:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.445 07:26:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:55.013 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:55.272 [2024-07-25 07:26:27.602629] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:55.272 [2024-07-25 07:26:27.602662] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a8750 name Existed_Raid, state configuring 00:20:55.272 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:55.531 [2024-07-25 07:26:27.831272] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:55.531 [2024-07-25 07:26:27.832657] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:55.531 [2024-07-25 07:26:27.832688] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:55.531 [2024-07-25 07:26:27.832698] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:55.531 [2024-07-25 07:26:27.832709] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:55.531 [2024-07-25 07:26:27.832717] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:55.531 [2024-07-25 07:26:27.832727] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.531 07:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.790 07:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.790 "name": "Existed_Raid", 00:20:55.790 "uuid": "46fcdd84-c911-433d-8398-6d9b116c18e6", 00:20:55.790 "strip_size_kb": 64, 00:20:55.790 "state": "configuring", 00:20:55.790 "raid_level": "concat", 00:20:55.790 "superblock": true, 00:20:55.790 "num_base_bdevs": 4, 00:20:55.790 "num_base_bdevs_discovered": 1, 00:20:55.790 "num_base_bdevs_operational": 4, 00:20:55.790 "base_bdevs_list": [ 00:20:55.790 { 00:20:55.790 "name": "BaseBdev1", 00:20:55.790 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:20:55.790 "is_configured": true, 00:20:55.790 "data_offset": 2048, 00:20:55.790 "data_size": 63488 00:20:55.790 }, 00:20:55.790 { 00:20:55.790 "name": "BaseBdev2", 00:20:55.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.790 "is_configured": false, 00:20:55.790 "data_offset": 0, 00:20:55.790 "data_size": 0 00:20:55.790 }, 00:20:55.790 { 00:20:55.790 "name": "BaseBdev3", 00:20:55.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.790 "is_configured": false, 00:20:55.790 "data_offset": 0, 00:20:55.790 "data_size": 0 00:20:55.790 }, 00:20:55.790 { 00:20:55.790 "name": "BaseBdev4", 00:20:55.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.790 "is_configured": false, 00:20:55.790 "data_offset": 0, 00:20:55.790 "data_size": 0 00:20:55.790 } 00:20:55.790 ] 00:20:55.790 }' 00:20:55.790 07:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.790 07:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:56.358 [2024-07-25 07:26:28.833048] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:56.358 BaseBdev2 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:56.358 07:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:56.617 07:26:29 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:56.876 [ 00:20:56.876 { 00:20:56.876 "name": "BaseBdev2", 00:20:56.876 "aliases": [ 00:20:56.876 "7ba32a41-a764-41a1-a7d2-c08f3442c687" 00:20:56.876 ], 00:20:56.876 "product_name": "Malloc disk", 00:20:56.876 "block_size": 512, 00:20:56.876 "num_blocks": 65536, 00:20:56.876 "uuid": "7ba32a41-a764-41a1-a7d2-c08f3442c687", 00:20:56.876 "assigned_rate_limits": { 00:20:56.876 "rw_ios_per_sec": 0, 00:20:56.876 "rw_mbytes_per_sec": 0, 00:20:56.876 "r_mbytes_per_sec": 0, 00:20:56.876 "w_mbytes_per_sec": 0 00:20:56.876 }, 00:20:56.876 "claimed": true, 00:20:56.876 "claim_type": "exclusive_write", 00:20:56.876 "zoned": false, 00:20:56.876 "supported_io_types": { 00:20:56.876 "read": true, 00:20:56.876 "write": true, 00:20:56.876 "unmap": true, 00:20:56.876 "flush": true, 00:20:56.876 "reset": true, 00:20:56.876 "nvme_admin": false, 00:20:56.876 "nvme_io": false, 00:20:56.876 "nvme_io_md": false, 00:20:56.876 "write_zeroes": true, 00:20:56.876 "zcopy": true, 00:20:56.876 "get_zone_info": false, 00:20:56.876 "zone_management": false, 00:20:56.876 "zone_append": false, 00:20:56.876 "compare": false, 00:20:56.876 "compare_and_write": false, 00:20:56.876 "abort": true, 00:20:56.876 "seek_hole": false, 00:20:56.876 "seek_data": false, 00:20:56.876 "copy": true, 00:20:56.876 "nvme_iov_md": false 00:20:56.876 }, 00:20:56.876 "memory_domains": [ 00:20:56.876 { 00:20:56.876 "dma_device_id": "system", 00:20:56.876 "dma_device_type": 1 00:20:56.876 }, 00:20:56.876 { 00:20:56.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.876 "dma_device_type": 2 00:20:56.876 } 00:20:56.876 ], 00:20:56.876 "driver_specific": {} 00:20:56.876 } 00:20:56.876 ] 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:56.876 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.135 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.135 "name": "Existed_Raid", 00:20:57.135 "uuid": "46fcdd84-c911-433d-8398-6d9b116c18e6", 00:20:57.135 "strip_size_kb": 64, 00:20:57.135 "state": "configuring", 00:20:57.135 "raid_level": "concat", 00:20:57.135 "superblock": true, 00:20:57.135 "num_base_bdevs": 4, 00:20:57.135 "num_base_bdevs_discovered": 2, 00:20:57.135 "num_base_bdevs_operational": 4, 00:20:57.135 "base_bdevs_list": [ 00:20:57.135 { 00:20:57.135 "name": "BaseBdev1", 00:20:57.135 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:20:57.135 "is_configured": true, 00:20:57.135 "data_offset": 2048, 00:20:57.135 "data_size": 63488 00:20:57.135 }, 00:20:57.135 { 00:20:57.135 "name": "BaseBdev2", 00:20:57.135 "uuid": "7ba32a41-a764-41a1-a7d2-c08f3442c687", 00:20:57.135 "is_configured": true, 00:20:57.135 "data_offset": 2048, 00:20:57.135 "data_size": 63488 00:20:57.135 }, 00:20:57.135 { 00:20:57.135 "name": "BaseBdev3", 00:20:57.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.135 "is_configured": false, 00:20:57.135 "data_offset": 0, 00:20:57.135 "data_size": 0 00:20:57.135 }, 00:20:57.135 { 00:20:57.135 "name": "BaseBdev4", 00:20:57.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.135 "is_configured": false, 00:20:57.135 "data_offset": 0, 00:20:57.135 "data_size": 0 00:20:57.135 } 00:20:57.135 ] 00:20:57.135 }' 00:20:57.135 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.135 07:26:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:57.393 07:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:57.652 [2024-07-25 07:26:30.107570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:57.652 BaseBdev3 00:20:57.652 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:57.652 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:57.652 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:57.652 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:57.652 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:57.652 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:57.652 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:57.910 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:58.169 [ 00:20:58.169 { 00:20:58.169 "name": "BaseBdev3", 00:20:58.169 "aliases": [ 00:20:58.169 "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5" 00:20:58.169 ], 00:20:58.169 "product_name": "Malloc disk", 00:20:58.169 "block_size": 512, 00:20:58.169 "num_blocks": 
65536, 00:20:58.169 "uuid": "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5", 00:20:58.169 "assigned_rate_limits": { 00:20:58.169 "rw_ios_per_sec": 0, 00:20:58.169 "rw_mbytes_per_sec": 0, 00:20:58.169 "r_mbytes_per_sec": 0, 00:20:58.169 "w_mbytes_per_sec": 0 00:20:58.169 }, 00:20:58.169 "claimed": true, 00:20:58.169 "claim_type": "exclusive_write", 00:20:58.169 "zoned": false, 00:20:58.169 "supported_io_types": { 00:20:58.169 "read": true, 00:20:58.169 "write": true, 00:20:58.169 "unmap": true, 00:20:58.169 "flush": true, 00:20:58.169 "reset": true, 00:20:58.169 "nvme_admin": false, 00:20:58.169 "nvme_io": false, 00:20:58.169 "nvme_io_md": false, 00:20:58.169 "write_zeroes": true, 00:20:58.169 "zcopy": true, 00:20:58.169 "get_zone_info": false, 00:20:58.169 "zone_management": false, 00:20:58.169 "zone_append": false, 00:20:58.169 "compare": false, 00:20:58.169 "compare_and_write": false, 00:20:58.169 "abort": true, 00:20:58.169 "seek_hole": false, 00:20:58.169 "seek_data": false, 00:20:58.169 "copy": true, 00:20:58.169 "nvme_iov_md": false 00:20:58.169 }, 00:20:58.169 "memory_domains": [ 00:20:58.169 { 00:20:58.169 "dma_device_id": "system", 00:20:58.169 "dma_device_type": 1 00:20:58.169 }, 00:20:58.169 { 00:20:58.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.169 "dma_device_type": 2 00:20:58.169 } 00:20:58.169 ], 00:20:58.169 "driver_specific": {} 00:20:58.169 } 00:20:58.169 ] 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.169 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.428 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.428 "name": "Existed_Raid", 00:20:58.428 "uuid": "46fcdd84-c911-433d-8398-6d9b116c18e6", 00:20:58.428 "strip_size_kb": 64, 00:20:58.428 "state": "configuring", 00:20:58.428 
"raid_level": "concat", 00:20:58.428 "superblock": true, 00:20:58.428 "num_base_bdevs": 4, 00:20:58.428 "num_base_bdevs_discovered": 3, 00:20:58.429 "num_base_bdevs_operational": 4, 00:20:58.429 "base_bdevs_list": [ 00:20:58.429 { 00:20:58.429 "name": "BaseBdev1", 00:20:58.429 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:20:58.429 "is_configured": true, 00:20:58.429 "data_offset": 2048, 00:20:58.429 "data_size": 63488 00:20:58.429 }, 00:20:58.429 { 00:20:58.429 "name": "BaseBdev2", 00:20:58.429 "uuid": "7ba32a41-a764-41a1-a7d2-c08f3442c687", 00:20:58.429 "is_configured": true, 00:20:58.429 "data_offset": 2048, 00:20:58.429 "data_size": 63488 00:20:58.429 }, 00:20:58.429 { 00:20:58.429 "name": "BaseBdev3", 00:20:58.429 "uuid": "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5", 00:20:58.429 "is_configured": true, 00:20:58.429 "data_offset": 2048, 00:20:58.429 "data_size": 63488 00:20:58.429 }, 00:20:58.429 { 00:20:58.429 "name": "BaseBdev4", 00:20:58.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.429 "is_configured": false, 00:20:58.429 "data_offset": 0, 00:20:58.429 "data_size": 0 00:20:58.429 } 00:20:58.429 ] 00:20:58.429 }' 00:20:58.429 07:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.429 07:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:58.995 07:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:59.254 [2024-07-25 07:26:31.602798] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:59.254 [2024-07-25 07:26:31.602957] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15a97b0 00:20:59.254 [2024-07-25 07:26:31.602970] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:59.254 [2024-07-25 07:26:31.603131] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x175c9d0 00:20:59.254 [2024-07-25 07:26:31.603254] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15a97b0 00:20:59.254 [2024-07-25 07:26:31.603263] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15a97b0 00:20:59.254 [2024-07-25 07:26:31.603345] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:59.254 BaseBdev4 00:20:59.254 07:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:59.254 07:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:59.254 07:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:59.254 07:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:59.254 07:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:59.254 07:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:59.254 07:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.550 07:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:59.840 [ 00:20:59.840 { 00:20:59.840 "name": "BaseBdev4", 00:20:59.840 "aliases": [ 00:20:59.840 "f0cf1927-1db8-40b4-8945-dbaabfeaf05e" 00:20:59.840 ], 00:20:59.840 "product_name": "Malloc disk", 00:20:59.840 "block_size": 512, 00:20:59.840 "num_blocks": 65536, 00:20:59.840 "uuid": "f0cf1927-1db8-40b4-8945-dbaabfeaf05e", 00:20:59.840 "assigned_rate_limits": { 00:20:59.840 "rw_ios_per_sec": 0, 00:20:59.840 "rw_mbytes_per_sec": 0, 00:20:59.840 "r_mbytes_per_sec": 0, 00:20:59.840 "w_mbytes_per_sec": 0 00:20:59.840 }, 00:20:59.840 "claimed": true, 00:20:59.840 "claim_type": "exclusive_write", 00:20:59.840 "zoned": false, 00:20:59.840 "supported_io_types": { 00:20:59.840 "read": true, 00:20:59.840 "write": true, 00:20:59.840 "unmap": true, 00:20:59.840 "flush": true, 00:20:59.840 "reset": true, 00:20:59.840 "nvme_admin": false, 00:20:59.840 "nvme_io": false, 00:20:59.840 "nvme_io_md": false, 00:20:59.840 "write_zeroes": true, 00:20:59.840 "zcopy": true, 00:20:59.840 "get_zone_info": false, 00:20:59.840 "zone_management": false, 00:20:59.840 "zone_append": false, 00:20:59.840 "compare": false, 00:20:59.840 "compare_and_write": false, 00:20:59.840 "abort": true, 00:20:59.840 "seek_hole": false, 00:20:59.840 "seek_data": false, 00:20:59.840 "copy": true, 00:20:59.840 "nvme_iov_md": false 00:20:59.840 }, 00:20:59.840 "memory_domains": [ 00:20:59.840 { 00:20:59.840 "dma_device_id": "system", 00:20:59.840 "dma_device_type": 1 00:20:59.840 }, 00:20:59.840 { 00:20:59.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.840 "dma_device_type": 2 00:20:59.840 } 00:20:59.840 ], 00:20:59.840 "driver_specific": {} 00:20:59.841 } 00:20:59.841 ] 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.841 "name": "Existed_Raid", 00:20:59.841 "uuid": "46fcdd84-c911-433d-8398-6d9b116c18e6", 00:20:59.841 "strip_size_kb": 64, 00:20:59.841 "state": "online", 00:20:59.841 "raid_level": "concat", 00:20:59.841 "superblock": true, 00:20:59.841 "num_base_bdevs": 4, 00:20:59.841 "num_base_bdevs_discovered": 4, 00:20:59.841 "num_base_bdevs_operational": 4, 00:20:59.841 "base_bdevs_list": [ 00:20:59.841 { 00:20:59.841 "name": "BaseBdev1", 00:20:59.841 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:20:59.841 "is_configured": true, 00:20:59.841 "data_offset": 2048, 00:20:59.841 "data_size": 63488 00:20:59.841 }, 00:20:59.841 { 00:20:59.841 "name": "BaseBdev2", 00:20:59.841 "uuid": "7ba32a41-a764-41a1-a7d2-c08f3442c687", 00:20:59.841 "is_configured": true, 00:20:59.841 "data_offset": 2048, 00:20:59.841 "data_size": 63488 00:20:59.841 }, 00:20:59.841 { 00:20:59.841 "name": "BaseBdev3", 00:20:59.841 "uuid": "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5", 00:20:59.841 "is_configured": true, 00:20:59.841 "data_offset": 2048, 00:20:59.841 "data_size": 63488 00:20:59.841 }, 00:20:59.841 { 00:20:59.841 "name": "BaseBdev4", 00:20:59.841 "uuid": "f0cf1927-1db8-40b4-8945-dbaabfeaf05e", 00:20:59.841 "is_configured": true, 00:20:59.841 "data_offset": 2048, 00:20:59.841 "data_size": 63488 00:20:59.841 } 00:20:59.841 ] 00:20:59.841 }' 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.841 07:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:00.776 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:01.034 [2024-07-25 07:26:33.351725] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:01.034 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:01.034 "name": "Existed_Raid", 00:21:01.034 "aliases": [ 00:21:01.034 "46fcdd84-c911-433d-8398-6d9b116c18e6" 00:21:01.034 ], 00:21:01.034 "product_name": "Raid Volume", 00:21:01.034 "block_size": 512, 00:21:01.034 "num_blocks": 253952, 00:21:01.034 "uuid": "46fcdd84-c911-433d-8398-6d9b116c18e6", 00:21:01.034 "assigned_rate_limits": { 00:21:01.034 "rw_ios_per_sec": 0, 00:21:01.034 "rw_mbytes_per_sec": 0, 00:21:01.034 "r_mbytes_per_sec": 0, 00:21:01.034 "w_mbytes_per_sec": 0 00:21:01.034 }, 00:21:01.034 "claimed": false, 00:21:01.034 "zoned": false, 00:21:01.034 "supported_io_types": { 00:21:01.034 "read": true, 00:21:01.034 "write": true, 
00:21:01.034 "unmap": true, 00:21:01.034 "flush": true, 00:21:01.035 "reset": true, 00:21:01.035 "nvme_admin": false, 00:21:01.035 "nvme_io": false, 00:21:01.035 "nvme_io_md": false, 00:21:01.035 "write_zeroes": true, 00:21:01.035 "zcopy": false, 00:21:01.035 "get_zone_info": false, 00:21:01.035 "zone_management": false, 00:21:01.035 "zone_append": false, 00:21:01.035 "compare": false, 00:21:01.035 "compare_and_write": false, 00:21:01.035 "abort": false, 00:21:01.035 "seek_hole": false, 00:21:01.035 "seek_data": false, 00:21:01.035 "copy": false, 00:21:01.035 "nvme_iov_md": false 00:21:01.035 }, 00:21:01.035 "memory_domains": [ 00:21:01.035 { 00:21:01.035 "dma_device_id": "system", 00:21:01.035 "dma_device_type": 1 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.035 "dma_device_type": 2 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "dma_device_id": "system", 00:21:01.035 "dma_device_type": 1 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.035 "dma_device_type": 2 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "dma_device_id": "system", 00:21:01.035 "dma_device_type": 1 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.035 "dma_device_type": 2 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "dma_device_id": "system", 00:21:01.035 "dma_device_type": 1 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.035 "dma_device_type": 2 00:21:01.035 } 00:21:01.035 ], 00:21:01.035 "driver_specific": { 00:21:01.035 "raid": { 00:21:01.035 "uuid": "46fcdd84-c911-433d-8398-6d9b116c18e6", 00:21:01.035 "strip_size_kb": 64, 00:21:01.035 "state": "online", 00:21:01.035 "raid_level": "concat", 00:21:01.035 "superblock": true, 00:21:01.035 "num_base_bdevs": 4, 00:21:01.035 "num_base_bdevs_discovered": 4, 00:21:01.035 "num_base_bdevs_operational": 4, 00:21:01.035 "base_bdevs_list": [ 00:21:01.035 { 00:21:01.035 "name": "BaseBdev1", 00:21:01.035 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:21:01.035 "is_configured": true, 00:21:01.035 "data_offset": 2048, 00:21:01.035 "data_size": 63488 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "name": "BaseBdev2", 00:21:01.035 "uuid": "7ba32a41-a764-41a1-a7d2-c08f3442c687", 00:21:01.035 "is_configured": true, 00:21:01.035 "data_offset": 2048, 00:21:01.035 "data_size": 63488 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "name": "BaseBdev3", 00:21:01.035 "uuid": "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5", 00:21:01.035 "is_configured": true, 00:21:01.035 "data_offset": 2048, 00:21:01.035 "data_size": 63488 00:21:01.035 }, 00:21:01.035 { 00:21:01.035 "name": "BaseBdev4", 00:21:01.035 "uuid": "f0cf1927-1db8-40b4-8945-dbaabfeaf05e", 00:21:01.035 "is_configured": true, 00:21:01.035 "data_offset": 2048, 00:21:01.035 "data_size": 63488 00:21:01.035 } 00:21:01.035 ] 00:21:01.035 } 00:21:01.035 } 00:21:01.035 }' 00:21:01.035 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:01.035 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:01.035 BaseBdev2 00:21:01.035 BaseBdev3 00:21:01.035 BaseBdev4' 00:21:01.035 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.035 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.035 07:26:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:01.293 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.293 "name": "BaseBdev1", 00:21:01.293 "aliases": [ 00:21:01.293 "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78" 00:21:01.293 ], 00:21:01.293 "product_name": "Malloc disk", 00:21:01.293 "block_size": 512, 00:21:01.293 "num_blocks": 65536, 00:21:01.294 "uuid": "a0ce77be-a1ac-44ea-8fd1-fd9a4eadbf78", 00:21:01.294 "assigned_rate_limits": { 00:21:01.294 "rw_ios_per_sec": 0, 00:21:01.294 "rw_mbytes_per_sec": 0, 00:21:01.294 "r_mbytes_per_sec": 0, 00:21:01.294 "w_mbytes_per_sec": 0 00:21:01.294 }, 00:21:01.294 "claimed": true, 00:21:01.294 "claim_type": "exclusive_write", 00:21:01.294 "zoned": false, 00:21:01.294 "supported_io_types": { 00:21:01.294 "read": true, 00:21:01.294 "write": true, 00:21:01.294 "unmap": true, 00:21:01.294 "flush": true, 00:21:01.294 "reset": true, 00:21:01.294 "nvme_admin": false, 00:21:01.294 "nvme_io": false, 00:21:01.294 "nvme_io_md": false, 00:21:01.294 "write_zeroes": true, 00:21:01.294 "zcopy": true, 00:21:01.294 "get_zone_info": false, 00:21:01.294 "zone_management": false, 00:21:01.294 "zone_append": false, 00:21:01.294 "compare": false, 00:21:01.294 "compare_and_write": false, 00:21:01.294 "abort": true, 00:21:01.294 "seek_hole": false, 00:21:01.294 "seek_data": false, 00:21:01.294 "copy": true, 00:21:01.294 "nvme_iov_md": false 00:21:01.294 }, 00:21:01.294 "memory_domains": [ 00:21:01.294 { 00:21:01.294 "dma_device_id": "system", 00:21:01.294 "dma_device_type": 1 00:21:01.294 }, 00:21:01.294 { 00:21:01.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.294 "dma_device_type": 2 00:21:01.294 } 00:21:01.294 ], 00:21:01.294 "driver_specific": {} 00:21:01.294 }' 00:21:01.294 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.294 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.294 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.294 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.552 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.552 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:01.552 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.552 07:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.552 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:01.552 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.552 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.811 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:01.811 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.811 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:01.811 07:26:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.811 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.811 "name": "BaseBdev2", 00:21:01.811 "aliases": [ 00:21:01.811 "7ba32a41-a764-41a1-a7d2-c08f3442c687" 00:21:01.811 ], 00:21:01.811 "product_name": "Malloc disk", 00:21:01.811 "block_size": 512, 00:21:01.811 "num_blocks": 65536, 00:21:01.811 "uuid": "7ba32a41-a764-41a1-a7d2-c08f3442c687", 00:21:01.811 "assigned_rate_limits": { 00:21:01.811 "rw_ios_per_sec": 0, 00:21:01.811 "rw_mbytes_per_sec": 0, 00:21:01.811 "r_mbytes_per_sec": 0, 00:21:01.811 "w_mbytes_per_sec": 0 00:21:01.811 }, 00:21:01.811 "claimed": true, 00:21:01.811 "claim_type": "exclusive_write", 00:21:01.811 "zoned": false, 00:21:01.811 "supported_io_types": { 00:21:01.811 "read": true, 00:21:01.811 "write": true, 00:21:01.811 "unmap": true, 00:21:01.811 "flush": true, 00:21:01.811 "reset": true, 00:21:01.811 "nvme_admin": false, 00:21:01.811 "nvme_io": false, 00:21:01.811 "nvme_io_md": false, 00:21:01.811 "write_zeroes": true, 00:21:01.811 "zcopy": true, 00:21:01.811 "get_zone_info": false, 00:21:01.811 "zone_management": false, 00:21:01.811 "zone_append": false, 00:21:01.811 "compare": false, 00:21:01.811 "compare_and_write": false, 00:21:01.811 "abort": true, 00:21:01.811 "seek_hole": false, 00:21:01.811 "seek_data": false, 00:21:01.811 "copy": true, 00:21:01.811 "nvme_iov_md": false 00:21:01.811 }, 00:21:01.811 "memory_domains": [ 00:21:01.811 { 00:21:01.811 "dma_device_id": "system", 00:21:01.811 "dma_device_type": 1 00:21:01.811 }, 00:21:01.811 { 00:21:01.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.811 "dma_device_type": 2 00:21:01.811 } 00:21:01.811 ], 00:21:01.811 "driver_specific": {} 00:21:01.811 }' 00:21:01.811 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.069 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.069 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.069 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.069 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.070 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.070 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.070 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.070 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.328 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.328 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.328 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.328 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.328 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:02.328 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.586 07:26:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.586 "name": "BaseBdev3", 00:21:02.586 "aliases": [ 00:21:02.586 "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5" 00:21:02.586 ], 00:21:02.586 "product_name": "Malloc disk", 00:21:02.586 "block_size": 512, 00:21:02.586 "num_blocks": 65536, 00:21:02.586 "uuid": "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5", 00:21:02.586 "assigned_rate_limits": { 00:21:02.586 "rw_ios_per_sec": 0, 00:21:02.586 "rw_mbytes_per_sec": 0, 00:21:02.586 "r_mbytes_per_sec": 0, 00:21:02.586 "w_mbytes_per_sec": 0 00:21:02.586 }, 00:21:02.586 "claimed": true, 00:21:02.586 "claim_type": "exclusive_write", 00:21:02.586 "zoned": false, 00:21:02.586 "supported_io_types": { 00:21:02.586 "read": true, 00:21:02.586 "write": true, 00:21:02.586 "unmap": true, 00:21:02.586 "flush": true, 00:21:02.586 "reset": true, 00:21:02.586 "nvme_admin": false, 00:21:02.586 "nvme_io": false, 00:21:02.586 "nvme_io_md": false, 00:21:02.586 "write_zeroes": true, 00:21:02.586 "zcopy": true, 00:21:02.586 "get_zone_info": false, 00:21:02.586 "zone_management": false, 00:21:02.586 "zone_append": false, 00:21:02.586 "compare": false, 00:21:02.586 "compare_and_write": false, 00:21:02.586 "abort": true, 00:21:02.586 "seek_hole": false, 00:21:02.586 "seek_data": false, 00:21:02.586 "copy": true, 00:21:02.586 "nvme_iov_md": false 00:21:02.586 }, 00:21:02.586 "memory_domains": [ 00:21:02.586 { 00:21:02.586 "dma_device_id": "system", 00:21:02.586 "dma_device_type": 1 00:21:02.586 }, 00:21:02.586 { 00:21:02.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.586 "dma_device_type": 2 00:21:02.586 } 00:21:02.586 ], 00:21:02.586 "driver_specific": {} 00:21:02.586 }' 00:21:02.586 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.586 07:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.586 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.586 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.586 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.586 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.587 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:02.845 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.103 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.103 "name": "BaseBdev4", 00:21:03.103 
"aliases": [ 00:21:03.103 "f0cf1927-1db8-40b4-8945-dbaabfeaf05e" 00:21:03.103 ], 00:21:03.103 "product_name": "Malloc disk", 00:21:03.103 "block_size": 512, 00:21:03.103 "num_blocks": 65536, 00:21:03.103 "uuid": "f0cf1927-1db8-40b4-8945-dbaabfeaf05e", 00:21:03.103 "assigned_rate_limits": { 00:21:03.103 "rw_ios_per_sec": 0, 00:21:03.103 "rw_mbytes_per_sec": 0, 00:21:03.103 "r_mbytes_per_sec": 0, 00:21:03.104 "w_mbytes_per_sec": 0 00:21:03.104 }, 00:21:03.104 "claimed": true, 00:21:03.104 "claim_type": "exclusive_write", 00:21:03.104 "zoned": false, 00:21:03.104 "supported_io_types": { 00:21:03.104 "read": true, 00:21:03.104 "write": true, 00:21:03.104 "unmap": true, 00:21:03.104 "flush": true, 00:21:03.104 "reset": true, 00:21:03.104 "nvme_admin": false, 00:21:03.104 "nvme_io": false, 00:21:03.104 "nvme_io_md": false, 00:21:03.104 "write_zeroes": true, 00:21:03.104 "zcopy": true, 00:21:03.104 "get_zone_info": false, 00:21:03.104 "zone_management": false, 00:21:03.104 "zone_append": false, 00:21:03.104 "compare": false, 00:21:03.104 "compare_and_write": false, 00:21:03.104 "abort": true, 00:21:03.104 "seek_hole": false, 00:21:03.104 "seek_data": false, 00:21:03.104 "copy": true, 00:21:03.104 "nvme_iov_md": false 00:21:03.104 }, 00:21:03.104 "memory_domains": [ 00:21:03.104 { 00:21:03.104 "dma_device_id": "system", 00:21:03.104 "dma_device_type": 1 00:21:03.104 }, 00:21:03.104 { 00:21:03.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.104 "dma_device_type": 2 00:21:03.104 } 00:21:03.104 ], 00:21:03.104 "driver_specific": {} 00:21:03.104 }' 00:21:03.104 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.104 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.104 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.104 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.104 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.362 07:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:03.929 [2024-07-25 07:26:36.291204] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:03.929 [2024-07-25 07:26:36.291229] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:03.929 [2024-07-25 07:26:36.291272] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:03.929 07:26:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.929 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.189 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.189 "name": "Existed_Raid", 00:21:04.189 "uuid": "46fcdd84-c911-433d-8398-6d9b116c18e6", 00:21:04.189 "strip_size_kb": 64, 00:21:04.189 "state": "offline", 00:21:04.189 "raid_level": "concat", 00:21:04.189 "superblock": true, 00:21:04.189 "num_base_bdevs": 4, 00:21:04.189 "num_base_bdevs_discovered": 3, 00:21:04.189 "num_base_bdevs_operational": 3, 00:21:04.189 "base_bdevs_list": [ 00:21:04.189 { 00:21:04.189 "name": null, 00:21:04.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.189 "is_configured": false, 00:21:04.189 "data_offset": 2048, 00:21:04.189 "data_size": 63488 00:21:04.189 }, 00:21:04.189 { 00:21:04.189 "name": "BaseBdev2", 00:21:04.189 "uuid": "7ba32a41-a764-41a1-a7d2-c08f3442c687", 00:21:04.189 "is_configured": true, 00:21:04.189 "data_offset": 2048, 00:21:04.189 "data_size": 63488 00:21:04.189 }, 00:21:04.189 { 00:21:04.189 "name": "BaseBdev3", 00:21:04.189 "uuid": "3b0eb185-f757-4c97-b2a7-c2d87ebee6f5", 00:21:04.189 "is_configured": true, 00:21:04.189 "data_offset": 2048, 00:21:04.189 "data_size": 63488 00:21:04.189 }, 00:21:04.189 { 00:21:04.189 "name": "BaseBdev4", 00:21:04.189 "uuid": "f0cf1927-1db8-40b4-8945-dbaabfeaf05e", 00:21:04.189 "is_configured": true, 00:21:04.189 "data_offset": 2048, 00:21:04.189 "data_size": 63488 00:21:04.189 } 00:21:04.189 ] 00:21:04.189 }' 00:21:04.189 07:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.189 07:26:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:21:04.756 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:04.756 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:04.756 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.756 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:05.014 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:05.014 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:05.015 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:05.273 [2024-07-25 07:26:37.567641] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:05.273 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:05.273 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:05.273 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:05.273 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.532 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:05.532 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:05.532 07:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:05.532 [2024-07-25 07:26:38.022813] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:05.532 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:05.532 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:05.532 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.532 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:05.791 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:05.791 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:05.791 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:06.050 [2024-07-25 07:26:38.485911] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:06.050 [2024-07-25 07:26:38.485949] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a97b0 name Existed_Raid, state offline 00:21:06.050 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ 
)) 00:21:06.050 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:06.050 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.050 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:06.309 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:06.309 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:06.310 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:06.310 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:06.310 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:06.310 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:06.568 BaseBdev2 00:21:06.568 07:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:06.568 07:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:06.568 07:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:06.568 07:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:06.568 07:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:06.568 07:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:06.568 07:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:06.826 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:07.085 [ 00:21:07.085 { 00:21:07.085 "name": "BaseBdev2", 00:21:07.085 "aliases": [ 00:21:07.085 "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e" 00:21:07.085 ], 00:21:07.085 "product_name": "Malloc disk", 00:21:07.085 "block_size": 512, 00:21:07.085 "num_blocks": 65536, 00:21:07.085 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:07.085 "assigned_rate_limits": { 00:21:07.085 "rw_ios_per_sec": 0, 00:21:07.085 "rw_mbytes_per_sec": 0, 00:21:07.085 "r_mbytes_per_sec": 0, 00:21:07.085 "w_mbytes_per_sec": 0 00:21:07.085 }, 00:21:07.085 "claimed": false, 00:21:07.085 "zoned": false, 00:21:07.085 "supported_io_types": { 00:21:07.085 "read": true, 00:21:07.085 "write": true, 00:21:07.085 "unmap": true, 00:21:07.085 "flush": true, 00:21:07.085 "reset": true, 00:21:07.085 "nvme_admin": false, 00:21:07.085 "nvme_io": false, 00:21:07.085 "nvme_io_md": false, 00:21:07.085 "write_zeroes": true, 00:21:07.085 "zcopy": true, 00:21:07.085 "get_zone_info": false, 00:21:07.085 "zone_management": false, 00:21:07.085 "zone_append": false, 00:21:07.085 "compare": false, 00:21:07.085 "compare_and_write": false, 00:21:07.085 "abort": true, 00:21:07.085 "seek_hole": false, 00:21:07.085 "seek_data": false, 00:21:07.085 
"copy": true, 00:21:07.085 "nvme_iov_md": false 00:21:07.085 }, 00:21:07.085 "memory_domains": [ 00:21:07.085 { 00:21:07.085 "dma_device_id": "system", 00:21:07.085 "dma_device_type": 1 00:21:07.085 }, 00:21:07.085 { 00:21:07.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.085 "dma_device_type": 2 00:21:07.085 } 00:21:07.085 ], 00:21:07.085 "driver_specific": {} 00:21:07.085 } 00:21:07.085 ] 00:21:07.085 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:07.085 07:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:07.085 07:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:07.086 07:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:07.344 BaseBdev3 00:21:07.344 07:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:07.345 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:21:07.345 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:07.345 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:07.345 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:07.345 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:07.345 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:07.345 07:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:07.602 [ 00:21:07.602 { 00:21:07.602 "name": "BaseBdev3", 00:21:07.602 "aliases": [ 00:21:07.602 "a5d6ade4-6a83-4b27-8972-b0864488892a" 00:21:07.602 ], 00:21:07.602 "product_name": "Malloc disk", 00:21:07.602 "block_size": 512, 00:21:07.602 "num_blocks": 65536, 00:21:07.603 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:07.603 "assigned_rate_limits": { 00:21:07.603 "rw_ios_per_sec": 0, 00:21:07.603 "rw_mbytes_per_sec": 0, 00:21:07.603 "r_mbytes_per_sec": 0, 00:21:07.603 "w_mbytes_per_sec": 0 00:21:07.603 }, 00:21:07.603 "claimed": false, 00:21:07.603 "zoned": false, 00:21:07.603 "supported_io_types": { 00:21:07.603 "read": true, 00:21:07.603 "write": true, 00:21:07.603 "unmap": true, 00:21:07.603 "flush": true, 00:21:07.603 "reset": true, 00:21:07.603 "nvme_admin": false, 00:21:07.603 "nvme_io": false, 00:21:07.603 "nvme_io_md": false, 00:21:07.603 "write_zeroes": true, 00:21:07.603 "zcopy": true, 00:21:07.603 "get_zone_info": false, 00:21:07.603 "zone_management": false, 00:21:07.603 "zone_append": false, 00:21:07.603 "compare": false, 00:21:07.603 "compare_and_write": false, 00:21:07.603 "abort": true, 00:21:07.603 "seek_hole": false, 00:21:07.603 "seek_data": false, 00:21:07.603 "copy": true, 00:21:07.603 "nvme_iov_md": false 00:21:07.603 }, 00:21:07.603 "memory_domains": [ 00:21:07.603 { 00:21:07.603 "dma_device_id": "system", 00:21:07.603 "dma_device_type": 1 00:21:07.603 }, 00:21:07.603 { 00:21:07.603 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:21:07.603 "dma_device_type": 2 00:21:07.603 } 00:21:07.603 ], 00:21:07.603 "driver_specific": {} 00:21:07.603 } 00:21:07.603 ] 00:21:07.603 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:07.603 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:07.603 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:07.603 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:07.861 BaseBdev4 00:21:07.861 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:07.861 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:21:07.861 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:07.861 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:07.861 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:07.861 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:07.861 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:08.120 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:08.379 [ 00:21:08.379 { 00:21:08.379 "name": "BaseBdev4", 00:21:08.379 "aliases": [ 00:21:08.379 "c449c925-4b0f-4047-99f8-3a476c0679ff" 00:21:08.379 ], 00:21:08.379 "product_name": "Malloc disk", 00:21:08.379 "block_size": 512, 00:21:08.379 "num_blocks": 65536, 00:21:08.379 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:08.379 "assigned_rate_limits": { 00:21:08.379 "rw_ios_per_sec": 0, 00:21:08.379 "rw_mbytes_per_sec": 0, 00:21:08.379 "r_mbytes_per_sec": 0, 00:21:08.379 "w_mbytes_per_sec": 0 00:21:08.379 }, 00:21:08.379 "claimed": false, 00:21:08.379 "zoned": false, 00:21:08.379 "supported_io_types": { 00:21:08.379 "read": true, 00:21:08.379 "write": true, 00:21:08.379 "unmap": true, 00:21:08.379 "flush": true, 00:21:08.379 "reset": true, 00:21:08.379 "nvme_admin": false, 00:21:08.379 "nvme_io": false, 00:21:08.379 "nvme_io_md": false, 00:21:08.379 "write_zeroes": true, 00:21:08.379 "zcopy": true, 00:21:08.379 "get_zone_info": false, 00:21:08.379 "zone_management": false, 00:21:08.379 "zone_append": false, 00:21:08.379 "compare": false, 00:21:08.379 "compare_and_write": false, 00:21:08.379 "abort": true, 00:21:08.379 "seek_hole": false, 00:21:08.379 "seek_data": false, 00:21:08.379 "copy": true, 00:21:08.379 "nvme_iov_md": false 00:21:08.379 }, 00:21:08.379 "memory_domains": [ 00:21:08.379 { 00:21:08.379 "dma_device_id": "system", 00:21:08.379 "dma_device_type": 1 00:21:08.379 }, 00:21:08.379 { 00:21:08.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.379 "dma_device_type": 2 00:21:08.379 } 00:21:08.379 ], 00:21:08.379 "driver_specific": {} 00:21:08.379 } 00:21:08.379 ] 00:21:08.379 07:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # 
return 0 00:21:08.379 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:08.379 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:08.379 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:08.638 [2024-07-25 07:26:40.975387] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:08.638 [2024-07-25 07:26:40.975423] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:08.638 [2024-07-25 07:26:40.975440] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:08.638 [2024-07-25 07:26:40.976619] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:08.638 [2024-07-25 07:26:40.976659] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.638 07:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:08.897 07:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.897 "name": "Existed_Raid", 00:21:08.897 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:08.897 "strip_size_kb": 64, 00:21:08.897 "state": "configuring", 00:21:08.897 "raid_level": "concat", 00:21:08.897 "superblock": true, 00:21:08.897 "num_base_bdevs": 4, 00:21:08.897 "num_base_bdevs_discovered": 3, 00:21:08.897 "num_base_bdevs_operational": 4, 00:21:08.897 "base_bdevs_list": [ 00:21:08.897 { 00:21:08.897 "name": "BaseBdev1", 00:21:08.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:08.897 "is_configured": false, 00:21:08.897 "data_offset": 0, 00:21:08.897 "data_size": 0 00:21:08.897 }, 00:21:08.897 { 00:21:08.897 "name": "BaseBdev2", 00:21:08.897 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:08.897 "is_configured": true, 00:21:08.897 
"data_offset": 2048, 00:21:08.897 "data_size": 63488 00:21:08.897 }, 00:21:08.897 { 00:21:08.897 "name": "BaseBdev3", 00:21:08.897 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:08.897 "is_configured": true, 00:21:08.897 "data_offset": 2048, 00:21:08.897 "data_size": 63488 00:21:08.897 }, 00:21:08.897 { 00:21:08.897 "name": "BaseBdev4", 00:21:08.897 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:08.897 "is_configured": true, 00:21:08.897 "data_offset": 2048, 00:21:08.897 "data_size": 63488 00:21:08.897 } 00:21:08.897 ] 00:21:08.897 }' 00:21:08.897 07:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.897 07:26:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:09.464 07:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:09.723 [2024-07-25 07:26:42.002070] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:09.723 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:09.724 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:09.724 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:09.724 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.724 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:09.724 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.724 "name": "Existed_Raid", 00:21:09.724 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:09.724 "strip_size_kb": 64, 00:21:09.724 "state": "configuring", 00:21:09.724 "raid_level": "concat", 00:21:09.724 "superblock": true, 00:21:09.724 "num_base_bdevs": 4, 00:21:09.724 "num_base_bdevs_discovered": 2, 00:21:09.724 "num_base_bdevs_operational": 4, 00:21:09.724 "base_bdevs_list": [ 00:21:09.724 { 00:21:09.724 "name": "BaseBdev1", 00:21:09.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.724 "is_configured": false, 00:21:09.724 "data_offset": 0, 00:21:09.724 "data_size": 0 00:21:09.724 }, 00:21:09.724 { 00:21:09.724 "name": null, 00:21:09.724 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:09.724 "is_configured": false, 00:21:09.724 "data_offset": 2048, 00:21:09.724 "data_size": 63488 
00:21:09.724 }, 00:21:09.724 { 00:21:09.724 "name": "BaseBdev3", 00:21:09.724 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:09.724 "is_configured": true, 00:21:09.724 "data_offset": 2048, 00:21:09.724 "data_size": 63488 00:21:09.724 }, 00:21:09.724 { 00:21:09.724 "name": "BaseBdev4", 00:21:09.724 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:09.724 "is_configured": true, 00:21:09.724 "data_offset": 2048, 00:21:09.724 "data_size": 63488 00:21:09.724 } 00:21:09.724 ] 00:21:09.724 }' 00:21:09.724 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.724 07:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:10.659 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.659 07:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:10.659 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:10.659 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:10.918 [2024-07-25 07:26:43.276785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:10.918 BaseBdev1 00:21:10.918 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:10.918 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:21:10.918 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:10.918 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:10.918 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:10.918 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:10.918 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:11.177 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:11.436 [ 00:21:11.436 { 00:21:11.436 "name": "BaseBdev1", 00:21:11.436 "aliases": [ 00:21:11.436 "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d" 00:21:11.436 ], 00:21:11.436 "product_name": "Malloc disk", 00:21:11.436 "block_size": 512, 00:21:11.436 "num_blocks": 65536, 00:21:11.436 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:11.436 "assigned_rate_limits": { 00:21:11.436 "rw_ios_per_sec": 0, 00:21:11.436 "rw_mbytes_per_sec": 0, 00:21:11.436 "r_mbytes_per_sec": 0, 00:21:11.436 "w_mbytes_per_sec": 0 00:21:11.436 }, 00:21:11.436 "claimed": true, 00:21:11.436 "claim_type": "exclusive_write", 00:21:11.436 "zoned": false, 00:21:11.436 "supported_io_types": { 00:21:11.436 "read": true, 00:21:11.436 "write": true, 00:21:11.436 "unmap": true, 00:21:11.436 "flush": true, 00:21:11.436 "reset": true, 00:21:11.436 "nvme_admin": false, 00:21:11.436 "nvme_io": false, 00:21:11.436 "nvme_io_md": 
false, 00:21:11.436 "write_zeroes": true, 00:21:11.436 "zcopy": true, 00:21:11.436 "get_zone_info": false, 00:21:11.436 "zone_management": false, 00:21:11.436 "zone_append": false, 00:21:11.436 "compare": false, 00:21:11.436 "compare_and_write": false, 00:21:11.436 "abort": true, 00:21:11.436 "seek_hole": false, 00:21:11.436 "seek_data": false, 00:21:11.436 "copy": true, 00:21:11.436 "nvme_iov_md": false 00:21:11.436 }, 00:21:11.436 "memory_domains": [ 00:21:11.436 { 00:21:11.436 "dma_device_id": "system", 00:21:11.436 "dma_device_type": 1 00:21:11.436 }, 00:21:11.436 { 00:21:11.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.436 "dma_device_type": 2 00:21:11.436 } 00:21:11.436 ], 00:21:11.436 "driver_specific": {} 00:21:11.436 } 00:21:11.436 ] 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.436 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:11.695 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:11.695 "name": "Existed_Raid", 00:21:11.695 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:11.695 "strip_size_kb": 64, 00:21:11.695 "state": "configuring", 00:21:11.695 "raid_level": "concat", 00:21:11.695 "superblock": true, 00:21:11.695 "num_base_bdevs": 4, 00:21:11.695 "num_base_bdevs_discovered": 3, 00:21:11.695 "num_base_bdevs_operational": 4, 00:21:11.695 "base_bdevs_list": [ 00:21:11.695 { 00:21:11.695 "name": "BaseBdev1", 00:21:11.695 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:11.695 "is_configured": true, 00:21:11.695 "data_offset": 2048, 00:21:11.695 "data_size": 63488 00:21:11.695 }, 00:21:11.695 { 00:21:11.695 "name": null, 00:21:11.695 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:11.695 "is_configured": false, 00:21:11.695 "data_offset": 2048, 00:21:11.695 "data_size": 63488 00:21:11.695 }, 00:21:11.695 { 00:21:11.695 "name": "BaseBdev3", 00:21:11.695 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:11.695 "is_configured": true, 00:21:11.695 "data_offset": 2048, 00:21:11.695 
"data_size": 63488 00:21:11.695 }, 00:21:11.695 { 00:21:11.695 "name": "BaseBdev4", 00:21:11.695 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:11.695 "is_configured": true, 00:21:11.695 "data_offset": 2048, 00:21:11.695 "data_size": 63488 00:21:11.695 } 00:21:11.695 ] 00:21:11.695 }' 00:21:11.695 07:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:11.695 07:26:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:12.263 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:12.263 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.263 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:12.263 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:12.524 [2024-07-25 07:26:44.941202] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.524 07:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:12.817 07:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.817 "name": "Existed_Raid", 00:21:12.817 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:12.817 "strip_size_kb": 64, 00:21:12.817 "state": "configuring", 00:21:12.817 "raid_level": "concat", 00:21:12.817 "superblock": true, 00:21:12.817 "num_base_bdevs": 4, 00:21:12.817 "num_base_bdevs_discovered": 2, 00:21:12.817 "num_base_bdevs_operational": 4, 00:21:12.817 "base_bdevs_list": [ 00:21:12.817 { 00:21:12.817 "name": "BaseBdev1", 00:21:12.817 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:12.817 "is_configured": true, 00:21:12.817 "data_offset": 2048, 00:21:12.817 "data_size": 63488 00:21:12.817 }, 00:21:12.817 { 
00:21:12.817 "name": null, 00:21:12.817 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:12.817 "is_configured": false, 00:21:12.817 "data_offset": 2048, 00:21:12.817 "data_size": 63488 00:21:12.817 }, 00:21:12.817 { 00:21:12.817 "name": null, 00:21:12.817 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:12.817 "is_configured": false, 00:21:12.817 "data_offset": 2048, 00:21:12.817 "data_size": 63488 00:21:12.817 }, 00:21:12.817 { 00:21:12.817 "name": "BaseBdev4", 00:21:12.817 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:12.817 "is_configured": true, 00:21:12.817 "data_offset": 2048, 00:21:12.817 "data_size": 63488 00:21:12.817 } 00:21:12.817 ] 00:21:12.817 }' 00:21:12.817 07:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.817 07:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:13.383 07:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.383 07:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:13.642 07:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:13.642 07:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:13.901 [2024-07-25 07:26:46.196528] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:13.901 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.160 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.160 "name": "Existed_Raid", 00:21:14.160 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:14.160 "strip_size_kb": 64, 00:21:14.160 "state": "configuring", 00:21:14.160 "raid_level": "concat", 
00:21:14.160 "superblock": true, 00:21:14.160 "num_base_bdevs": 4, 00:21:14.160 "num_base_bdevs_discovered": 3, 00:21:14.160 "num_base_bdevs_operational": 4, 00:21:14.160 "base_bdevs_list": [ 00:21:14.160 { 00:21:14.160 "name": "BaseBdev1", 00:21:14.160 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:14.160 "is_configured": true, 00:21:14.160 "data_offset": 2048, 00:21:14.160 "data_size": 63488 00:21:14.160 }, 00:21:14.160 { 00:21:14.160 "name": null, 00:21:14.160 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:14.160 "is_configured": false, 00:21:14.160 "data_offset": 2048, 00:21:14.160 "data_size": 63488 00:21:14.160 }, 00:21:14.160 { 00:21:14.160 "name": "BaseBdev3", 00:21:14.160 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:14.160 "is_configured": true, 00:21:14.160 "data_offset": 2048, 00:21:14.160 "data_size": 63488 00:21:14.160 }, 00:21:14.160 { 00:21:14.160 "name": "BaseBdev4", 00:21:14.160 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:14.160 "is_configured": true, 00:21:14.160 "data_offset": 2048, 00:21:14.160 "data_size": 63488 00:21:14.160 } 00:21:14.160 ] 00:21:14.160 }' 00:21:14.160 07:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.160 07:26:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:14.727 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.727 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:14.727 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:14.727 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:14.986 [2024-07-25 07:26:47.455849] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.986 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.986 07:26:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:15.245 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.245 "name": "Existed_Raid", 00:21:15.245 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:15.245 "strip_size_kb": 64, 00:21:15.245 "state": "configuring", 00:21:15.245 "raid_level": "concat", 00:21:15.245 "superblock": true, 00:21:15.245 "num_base_bdevs": 4, 00:21:15.245 "num_base_bdevs_discovered": 2, 00:21:15.245 "num_base_bdevs_operational": 4, 00:21:15.245 "base_bdevs_list": [ 00:21:15.245 { 00:21:15.245 "name": null, 00:21:15.245 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:15.245 "is_configured": false, 00:21:15.245 "data_offset": 2048, 00:21:15.245 "data_size": 63488 00:21:15.245 }, 00:21:15.245 { 00:21:15.245 "name": null, 00:21:15.245 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:15.245 "is_configured": false, 00:21:15.245 "data_offset": 2048, 00:21:15.245 "data_size": 63488 00:21:15.245 }, 00:21:15.245 { 00:21:15.245 "name": "BaseBdev3", 00:21:15.245 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:15.245 "is_configured": true, 00:21:15.245 "data_offset": 2048, 00:21:15.245 "data_size": 63488 00:21:15.245 }, 00:21:15.245 { 00:21:15.245 "name": "BaseBdev4", 00:21:15.245 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:15.245 "is_configured": true, 00:21:15.245 "data_offset": 2048, 00:21:15.245 "data_size": 63488 00:21:15.245 } 00:21:15.245 ] 00:21:15.245 }' 00:21:15.245 07:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.245 07:26:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:15.812 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.813 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:16.071 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:16.071 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:16.330 [2024-07-25 07:26:48.717370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.330 07:26:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.330 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:16.589 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.589 "name": "Existed_Raid", 00:21:16.589 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:16.589 "strip_size_kb": 64, 00:21:16.589 "state": "configuring", 00:21:16.589 "raid_level": "concat", 00:21:16.589 "superblock": true, 00:21:16.589 "num_base_bdevs": 4, 00:21:16.589 "num_base_bdevs_discovered": 3, 00:21:16.589 "num_base_bdevs_operational": 4, 00:21:16.589 "base_bdevs_list": [ 00:21:16.589 { 00:21:16.589 "name": null, 00:21:16.589 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:16.589 "is_configured": false, 00:21:16.589 "data_offset": 2048, 00:21:16.589 "data_size": 63488 00:21:16.589 }, 00:21:16.589 { 00:21:16.589 "name": "BaseBdev2", 00:21:16.589 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:16.589 "is_configured": true, 00:21:16.589 "data_offset": 2048, 00:21:16.589 "data_size": 63488 00:21:16.589 }, 00:21:16.589 { 00:21:16.589 "name": "BaseBdev3", 00:21:16.589 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:16.589 "is_configured": true, 00:21:16.589 "data_offset": 2048, 00:21:16.589 "data_size": 63488 00:21:16.589 }, 00:21:16.589 { 00:21:16.589 "name": "BaseBdev4", 00:21:16.589 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:16.589 "is_configured": true, 00:21:16.589 "data_offset": 2048, 00:21:16.589 "data_size": 63488 00:21:16.589 } 00:21:16.589 ] 00:21:16.589 }' 00:21:16.589 07:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.589 07:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:17.157 07:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:17.157 07:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.416 07:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:17.416 07:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.416 07:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:17.675 07:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d 00:21:17.675 [2024-07-25 07:26:50.188305] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:17.675 [2024-07-25 07:26:50.188454] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15a0a30 00:21:17.675 [2024-07-25 07:26:50.188466] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:17.675 [2024-07-25 07:26:50.188622] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a9740 00:21:17.675 [2024-07-25 07:26:50.188728] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15a0a30 00:21:17.675 [2024-07-25 07:26:50.188737] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15a0a30 00:21:17.675 [2024-07-25 07:26:50.188817] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:17.675 NewBaseBdev 00:21:17.675 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:17.675 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:21:17.675 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:17.675 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:17.675 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:17.675 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:17.675 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:17.934 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:18.193 [ 00:21:18.193 { 00:21:18.193 "name": "NewBaseBdev", 00:21:18.193 "aliases": [ 00:21:18.193 "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d" 00:21:18.193 ], 00:21:18.193 "product_name": "Malloc disk", 00:21:18.193 "block_size": 512, 00:21:18.193 "num_blocks": 65536, 00:21:18.193 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:18.193 "assigned_rate_limits": { 00:21:18.193 "rw_ios_per_sec": 0, 00:21:18.193 "rw_mbytes_per_sec": 0, 00:21:18.193 "r_mbytes_per_sec": 0, 00:21:18.193 "w_mbytes_per_sec": 0 00:21:18.193 }, 00:21:18.193 "claimed": true, 00:21:18.193 "claim_type": "exclusive_write", 00:21:18.193 "zoned": false, 00:21:18.193 "supported_io_types": { 00:21:18.193 "read": true, 00:21:18.193 "write": true, 00:21:18.193 "unmap": true, 00:21:18.193 "flush": true, 00:21:18.193 "reset": true, 00:21:18.193 "nvme_admin": false, 00:21:18.193 "nvme_io": false, 00:21:18.193 "nvme_io_md": false, 00:21:18.193 "write_zeroes": true, 00:21:18.193 "zcopy": true, 00:21:18.193 "get_zone_info": false, 00:21:18.193 "zone_management": false, 00:21:18.193 "zone_append": false, 00:21:18.193 "compare": false, 00:21:18.193 "compare_and_write": false, 00:21:18.193 "abort": true, 00:21:18.193 "seek_hole": false, 00:21:18.193 "seek_data": false, 00:21:18.193 "copy": true, 00:21:18.193 "nvme_iov_md": false 00:21:18.193 }, 00:21:18.193 "memory_domains": [ 00:21:18.193 { 00:21:18.193 "dma_device_id": "system", 00:21:18.193 "dma_device_type": 1 00:21:18.193 }, 00:21:18.193 { 00:21:18.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.193 "dma_device_type": 2 00:21:18.193 } 00:21:18.193 ], 00:21:18.193 "driver_specific": {} 00:21:18.193 } 00:21:18.193 ] 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:18.193 07:26:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.193 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:18.452 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.452 "name": "Existed_Raid", 00:21:18.452 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:18.452 "strip_size_kb": 64, 00:21:18.452 "state": "online", 00:21:18.452 "raid_level": "concat", 00:21:18.452 "superblock": true, 00:21:18.452 "num_base_bdevs": 4, 00:21:18.452 "num_base_bdevs_discovered": 4, 00:21:18.452 "num_base_bdevs_operational": 4, 00:21:18.452 "base_bdevs_list": [ 00:21:18.452 { 00:21:18.452 "name": "NewBaseBdev", 00:21:18.452 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:18.452 "is_configured": true, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 }, 00:21:18.452 { 00:21:18.452 "name": "BaseBdev2", 00:21:18.452 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:18.452 "is_configured": true, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 }, 00:21:18.452 { 00:21:18.452 "name": "BaseBdev3", 00:21:18.452 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:18.452 "is_configured": true, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 }, 00:21:18.452 { 00:21:18.452 "name": "BaseBdev4", 00:21:18.452 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:18.452 "is_configured": true, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 } 00:21:18.452 ] 00:21:18.452 }' 00:21:18.452 07:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.452 07:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:19.019 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:19.019 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:19.019 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:19.019 07:26:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:19.019 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:19.019 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:19.019 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:19.019 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:19.278 [2024-07-25 07:26:51.672507] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:19.278 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:19.278 "name": "Existed_Raid", 00:21:19.278 "aliases": [ 00:21:19.278 "0290c75d-d139-44ce-9c55-abb584e48635" 00:21:19.278 ], 00:21:19.278 "product_name": "Raid Volume", 00:21:19.278 "block_size": 512, 00:21:19.278 "num_blocks": 253952, 00:21:19.278 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:19.278 "assigned_rate_limits": { 00:21:19.278 "rw_ios_per_sec": 0, 00:21:19.278 "rw_mbytes_per_sec": 0, 00:21:19.278 "r_mbytes_per_sec": 0, 00:21:19.278 "w_mbytes_per_sec": 0 00:21:19.278 }, 00:21:19.278 "claimed": false, 00:21:19.278 "zoned": false, 00:21:19.278 "supported_io_types": { 00:21:19.278 "read": true, 00:21:19.278 "write": true, 00:21:19.278 "unmap": true, 00:21:19.278 "flush": true, 00:21:19.278 "reset": true, 00:21:19.278 "nvme_admin": false, 00:21:19.278 "nvme_io": false, 00:21:19.278 "nvme_io_md": false, 00:21:19.278 "write_zeroes": true, 00:21:19.278 "zcopy": false, 00:21:19.278 "get_zone_info": false, 00:21:19.278 "zone_management": false, 00:21:19.278 "zone_append": false, 00:21:19.278 "compare": false, 00:21:19.278 "compare_and_write": false, 00:21:19.278 "abort": false, 00:21:19.278 "seek_hole": false, 00:21:19.278 "seek_data": false, 00:21:19.278 "copy": false, 00:21:19.278 "nvme_iov_md": false 00:21:19.278 }, 00:21:19.278 "memory_domains": [ 00:21:19.279 { 00:21:19.279 "dma_device_id": "system", 00:21:19.279 "dma_device_type": 1 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.279 "dma_device_type": 2 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "dma_device_id": "system", 00:21:19.279 "dma_device_type": 1 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.279 "dma_device_type": 2 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "dma_device_id": "system", 00:21:19.279 "dma_device_type": 1 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.279 "dma_device_type": 2 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "dma_device_id": "system", 00:21:19.279 "dma_device_type": 1 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.279 "dma_device_type": 2 00:21:19.279 } 00:21:19.279 ], 00:21:19.279 "driver_specific": { 00:21:19.279 "raid": { 00:21:19.279 "uuid": "0290c75d-d139-44ce-9c55-abb584e48635", 00:21:19.279 "strip_size_kb": 64, 00:21:19.279 "state": "online", 00:21:19.279 "raid_level": "concat", 00:21:19.279 "superblock": true, 00:21:19.279 "num_base_bdevs": 4, 00:21:19.279 "num_base_bdevs_discovered": 4, 00:21:19.279 "num_base_bdevs_operational": 4, 00:21:19.279 "base_bdevs_list": [ 00:21:19.279 { 00:21:19.279 "name": "NewBaseBdev", 00:21:19.279 "uuid": 
"9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:19.279 "is_configured": true, 00:21:19.279 "data_offset": 2048, 00:21:19.279 "data_size": 63488 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "name": "BaseBdev2", 00:21:19.279 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:19.279 "is_configured": true, 00:21:19.279 "data_offset": 2048, 00:21:19.279 "data_size": 63488 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "name": "BaseBdev3", 00:21:19.279 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:19.279 "is_configured": true, 00:21:19.279 "data_offset": 2048, 00:21:19.279 "data_size": 63488 00:21:19.279 }, 00:21:19.279 { 00:21:19.279 "name": "BaseBdev4", 00:21:19.279 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:19.279 "is_configured": true, 00:21:19.279 "data_offset": 2048, 00:21:19.279 "data_size": 63488 00:21:19.279 } 00:21:19.279 ] 00:21:19.279 } 00:21:19.279 } 00:21:19.279 }' 00:21:19.279 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:19.279 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:19.279 BaseBdev2 00:21:19.279 BaseBdev3 00:21:19.279 BaseBdev4' 00:21:19.279 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.279 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:19.279 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:19.537 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:19.537 "name": "NewBaseBdev", 00:21:19.537 "aliases": [ 00:21:19.537 "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d" 00:21:19.537 ], 00:21:19.537 "product_name": "Malloc disk", 00:21:19.537 "block_size": 512, 00:21:19.537 "num_blocks": 65536, 00:21:19.537 "uuid": "9ee3d6d2-a0a6-4a3d-a1ab-40f9dd8d9c1d", 00:21:19.537 "assigned_rate_limits": { 00:21:19.537 "rw_ios_per_sec": 0, 00:21:19.537 "rw_mbytes_per_sec": 0, 00:21:19.537 "r_mbytes_per_sec": 0, 00:21:19.537 "w_mbytes_per_sec": 0 00:21:19.537 }, 00:21:19.537 "claimed": true, 00:21:19.537 "claim_type": "exclusive_write", 00:21:19.537 "zoned": false, 00:21:19.537 "supported_io_types": { 00:21:19.537 "read": true, 00:21:19.537 "write": true, 00:21:19.537 "unmap": true, 00:21:19.537 "flush": true, 00:21:19.537 "reset": true, 00:21:19.537 "nvme_admin": false, 00:21:19.537 "nvme_io": false, 00:21:19.537 "nvme_io_md": false, 00:21:19.537 "write_zeroes": true, 00:21:19.537 "zcopy": true, 00:21:19.537 "get_zone_info": false, 00:21:19.537 "zone_management": false, 00:21:19.537 "zone_append": false, 00:21:19.537 "compare": false, 00:21:19.537 "compare_and_write": false, 00:21:19.537 "abort": true, 00:21:19.537 "seek_hole": false, 00:21:19.537 "seek_data": false, 00:21:19.537 "copy": true, 00:21:19.537 "nvme_iov_md": false 00:21:19.537 }, 00:21:19.537 "memory_domains": [ 00:21:19.537 { 00:21:19.537 "dma_device_id": "system", 00:21:19.537 "dma_device_type": 1 00:21:19.537 }, 00:21:19.537 { 00:21:19.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.537 "dma_device_type": 2 00:21:19.537 } 00:21:19.537 ], 00:21:19.537 "driver_specific": {} 00:21:19.537 }' 00:21:19.537 07:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.537 07:26:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.537 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:19.538 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:19.796 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:20.055 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:20.055 "name": "BaseBdev2", 00:21:20.055 "aliases": [ 00:21:20.055 "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e" 00:21:20.055 ], 00:21:20.055 "product_name": "Malloc disk", 00:21:20.055 "block_size": 512, 00:21:20.055 "num_blocks": 65536, 00:21:20.055 "uuid": "c4c09cd8-af28-48a7-bc43-1f2378ecaf7e", 00:21:20.055 "assigned_rate_limits": { 00:21:20.055 "rw_ios_per_sec": 0, 00:21:20.055 "rw_mbytes_per_sec": 0, 00:21:20.055 "r_mbytes_per_sec": 0, 00:21:20.055 "w_mbytes_per_sec": 0 00:21:20.055 }, 00:21:20.055 "claimed": true, 00:21:20.055 "claim_type": "exclusive_write", 00:21:20.055 "zoned": false, 00:21:20.055 "supported_io_types": { 00:21:20.055 "read": true, 00:21:20.055 "write": true, 00:21:20.055 "unmap": true, 00:21:20.055 "flush": true, 00:21:20.055 "reset": true, 00:21:20.055 "nvme_admin": false, 00:21:20.055 "nvme_io": false, 00:21:20.055 "nvme_io_md": false, 00:21:20.055 "write_zeroes": true, 00:21:20.055 "zcopy": true, 00:21:20.055 "get_zone_info": false, 00:21:20.055 "zone_management": false, 00:21:20.055 "zone_append": false, 00:21:20.055 "compare": false, 00:21:20.055 "compare_and_write": false, 00:21:20.055 "abort": true, 00:21:20.055 "seek_hole": false, 00:21:20.055 "seek_data": false, 00:21:20.055 "copy": true, 00:21:20.055 "nvme_iov_md": false 00:21:20.055 }, 00:21:20.055 "memory_domains": [ 00:21:20.055 { 00:21:20.055 "dma_device_id": "system", 00:21:20.055 "dma_device_type": 1 00:21:20.055 }, 00:21:20.055 { 00:21:20.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.055 "dma_device_type": 2 00:21:20.055 } 00:21:20.055 ], 00:21:20.055 "driver_specific": {} 00:21:20.055 }' 00:21:20.055 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.313 07:26:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:20.313 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.572 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.572 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:20.572 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:20.572 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:20.572 07:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:20.830 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:20.830 "name": "BaseBdev3", 00:21:20.830 "aliases": [ 00:21:20.830 "a5d6ade4-6a83-4b27-8972-b0864488892a" 00:21:20.830 ], 00:21:20.830 "product_name": "Malloc disk", 00:21:20.830 "block_size": 512, 00:21:20.830 "num_blocks": 65536, 00:21:20.831 "uuid": "a5d6ade4-6a83-4b27-8972-b0864488892a", 00:21:20.831 "assigned_rate_limits": { 00:21:20.831 "rw_ios_per_sec": 0, 00:21:20.831 "rw_mbytes_per_sec": 0, 00:21:20.831 "r_mbytes_per_sec": 0, 00:21:20.831 "w_mbytes_per_sec": 0 00:21:20.831 }, 00:21:20.831 "claimed": true, 00:21:20.831 "claim_type": "exclusive_write", 00:21:20.831 "zoned": false, 00:21:20.831 "supported_io_types": { 00:21:20.831 "read": true, 00:21:20.831 "write": true, 00:21:20.831 "unmap": true, 00:21:20.831 "flush": true, 00:21:20.831 "reset": true, 00:21:20.831 "nvme_admin": false, 00:21:20.831 "nvme_io": false, 00:21:20.831 "nvme_io_md": false, 00:21:20.831 "write_zeroes": true, 00:21:20.831 "zcopy": true, 00:21:20.831 "get_zone_info": false, 00:21:20.831 "zone_management": false, 00:21:20.831 "zone_append": false, 00:21:20.831 "compare": false, 00:21:20.831 "compare_and_write": false, 00:21:20.831 "abort": true, 00:21:20.831 "seek_hole": false, 00:21:20.831 "seek_data": false, 00:21:20.831 "copy": true, 00:21:20.831 "nvme_iov_md": false 00:21:20.831 }, 00:21:20.831 "memory_domains": [ 00:21:20.831 { 00:21:20.831 "dma_device_id": "system", 00:21:20.831 "dma_device_type": 1 00:21:20.831 }, 00:21:20.831 { 00:21:20.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.831 "dma_device_type": 2 00:21:20.831 } 00:21:20.831 ], 00:21:20.831 "driver_specific": {} 00:21:20.831 }' 00:21:20.831 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.831 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.831 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:20.831 07:26:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.831 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.831 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:20.831 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.831 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.089 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:21.089 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.089 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.089 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.089 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.089 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:21.089 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.348 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.348 "name": "BaseBdev4", 00:21:21.348 "aliases": [ 00:21:21.348 "c449c925-4b0f-4047-99f8-3a476c0679ff" 00:21:21.348 ], 00:21:21.348 "product_name": "Malloc disk", 00:21:21.348 "block_size": 512, 00:21:21.348 "num_blocks": 65536, 00:21:21.348 "uuid": "c449c925-4b0f-4047-99f8-3a476c0679ff", 00:21:21.348 "assigned_rate_limits": { 00:21:21.348 "rw_ios_per_sec": 0, 00:21:21.348 "rw_mbytes_per_sec": 0, 00:21:21.348 "r_mbytes_per_sec": 0, 00:21:21.348 "w_mbytes_per_sec": 0 00:21:21.348 }, 00:21:21.348 "claimed": true, 00:21:21.348 "claim_type": "exclusive_write", 00:21:21.348 "zoned": false, 00:21:21.348 "supported_io_types": { 00:21:21.348 "read": true, 00:21:21.348 "write": true, 00:21:21.348 "unmap": true, 00:21:21.348 "flush": true, 00:21:21.348 "reset": true, 00:21:21.348 "nvme_admin": false, 00:21:21.348 "nvme_io": false, 00:21:21.348 "nvme_io_md": false, 00:21:21.348 "write_zeroes": true, 00:21:21.348 "zcopy": true, 00:21:21.348 "get_zone_info": false, 00:21:21.348 "zone_management": false, 00:21:21.348 "zone_append": false, 00:21:21.348 "compare": false, 00:21:21.348 "compare_and_write": false, 00:21:21.348 "abort": true, 00:21:21.348 "seek_hole": false, 00:21:21.348 "seek_data": false, 00:21:21.348 "copy": true, 00:21:21.348 "nvme_iov_md": false 00:21:21.348 }, 00:21:21.348 "memory_domains": [ 00:21:21.348 { 00:21:21.348 "dma_device_id": "system", 00:21:21.348 "dma_device_type": 1 00:21:21.348 }, 00:21:21.348 { 00:21:21.348 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.348 "dma_device_type": 2 00:21:21.348 } 00:21:21.348 ], 00:21:21.348 "driver_specific": {} 00:21:21.348 }' 00:21:21.348 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.348 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.348 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:21.348 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.348 07:26:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.348 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:21.348 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.606 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.606 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:21.606 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.606 07:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.606 07:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.606 07:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:21.865 [2024-07-25 07:26:54.247176] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:21.865 [2024-07-25 07:26:54.247199] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:21.865 [2024-07-25 07:26:54.247250] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:21.865 [2024-07-25 07:26:54.247306] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:21.865 [2024-07-25 07:26:54.247318] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a0a30 name Existed_Raid, state offline 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1683700 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1683700 ']' 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1683700 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1683700 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1683700' 00:21:21.865 killing process with pid 1683700 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1683700 00:21:21.865 [2024-07-25 07:26:54.325756] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:21.865 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1683700 00:21:21.865 [2024-07-25 07:26:54.358843] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:22.124 07:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:22.124 00:21:22.124 real 0m31.025s 00:21:22.124 user 0m56.979s 00:21:22.124 sys 0m5.563s 00:21:22.124 07:26:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:21:22.124 07:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:22.124 ************************************ 00:21:22.124 END TEST raid_state_function_test_sb 00:21:22.124 ************************************ 00:21:22.124 07:26:54 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:21:22.124 07:26:54 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:21:22.124 07:26:54 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:22.124 07:26:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:22.124 ************************************ 00:21:22.124 START TEST raid_superblock_test 00:21:22.124 ************************************ 00:21:22.124 07:26:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 4 00:21:22.124 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:21:22.124 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:21:22.124 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:21:22.124 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:21:22.124 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1689645 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1689645 /var/tmp/spdk-raid.sock 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1689645 ']' 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on 
UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:22.125 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:22.125 07:26:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:22.384 [2024-07-25 07:26:54.692950] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:21:22.384 [2024-07-25 07:26:54.692992] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1689645 ] 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:01.0 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: 
Requested device 0000:3f:01.3 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:01.4 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:22.384 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:22.384 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:22.384 [2024-07-25 07:26:54.809921] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:22.384 [2024-07-25 07:26:54.897060] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:21:22.643 [2024-07-25 07:26:54.950463] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:22.643 [2024-07-25 07:26:54.950486] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:23.209 07:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:23.468 malloc1 00:21:23.468 07:26:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:23.726 [2024-07-25 07:26:56.042546] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:23.726 [2024-07-25 07:26:56.042591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:23.726 [2024-07-25 07:26:56.042609] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c64280 00:21:23.726 [2024-07-25 07:26:56.042621] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:23.726 [2024-07-25 07:26:56.044107] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:23.726 [2024-07-25 07:26:56.044135] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:23.726 pt1 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:23.726 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:23.985 malloc2 00:21:23.985 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:23.985 [2024-07-25 07:26:56.508328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:23.985 [2024-07-25 07:26:56.508370] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:23.985 [2024-07-25 07:26:56.508385] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0f8c0 00:21:23.985 [2024-07-25 07:26:56.508397] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:23.985 [2024-07-25 07:26:56.509691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:23.985 [2024-07-25 07:26:56.509717] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:23.985 pt2 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:24.244 malloc3 00:21:24.244 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:24.502 [2024-07-25 07:26:56.969832] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:24.502 [2024-07-25 07:26:56.969876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:24.502 [2024-07-25 07:26:56.969892] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0fef0 00:21:24.502 [2024-07-25 07:26:56.969903] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:24.502 [2024-07-25 07:26:56.971236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:24.502 [2024-07-25 07:26:56.971263] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:24.502 pt3 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:24.503 07:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:24.761 malloc4 00:21:24.761 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:25.019 [2024-07-25 07:26:57.427223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:25.019 [2024-07-25 07:26:57.427262] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.019 [2024-07-25 07:26:57.427278] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e13330 00:21:25.019 [2024-07-25 07:26:57.427290] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.019 [2024-07-25 07:26:57.428576] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.019 [2024-07-25 07:26:57.428602] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:25.019 pt4 00:21:25.019 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:25.019 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:25.019 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:25.277 [2024-07-25 07:26:57.655848] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:25.277 [2024-07-25 07:26:57.656911] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:25.277 [2024-07-25 07:26:57.656975] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:25.277 [2024-07-25 07:26:57.657015] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:25.277 [2024-07-25 07:26:57.657183] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e12720 00:21:25.277 [2024-07-25 07:26:57.657194] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:25.277 [2024-07-25 07:26:57.657370] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e15e30 00:21:25.277 [2024-07-25 07:26:57.657494] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e12720 00:21:25.277 [2024-07-25 07:26:57.657503] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e12720 00:21:25.277 [2024-07-25 07:26:57.657585] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.277 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.564 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.564 "name": "raid_bdev1", 00:21:25.564 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:25.564 "strip_size_kb": 
64, 00:21:25.564 "state": "online", 00:21:25.564 "raid_level": "concat", 00:21:25.564 "superblock": true, 00:21:25.564 "num_base_bdevs": 4, 00:21:25.564 "num_base_bdevs_discovered": 4, 00:21:25.564 "num_base_bdevs_operational": 4, 00:21:25.564 "base_bdevs_list": [ 00:21:25.564 { 00:21:25.564 "name": "pt1", 00:21:25.564 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:25.564 "is_configured": true, 00:21:25.564 "data_offset": 2048, 00:21:25.564 "data_size": 63488 00:21:25.564 }, 00:21:25.564 { 00:21:25.564 "name": "pt2", 00:21:25.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:25.564 "is_configured": true, 00:21:25.564 "data_offset": 2048, 00:21:25.564 "data_size": 63488 00:21:25.564 }, 00:21:25.564 { 00:21:25.564 "name": "pt3", 00:21:25.564 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:25.564 "is_configured": true, 00:21:25.564 "data_offset": 2048, 00:21:25.564 "data_size": 63488 00:21:25.564 }, 00:21:25.564 { 00:21:25.564 "name": "pt4", 00:21:25.564 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:25.564 "is_configured": true, 00:21:25.564 "data_offset": 2048, 00:21:25.564 "data_size": 63488 00:21:25.564 } 00:21:25.564 ] 00:21:25.564 }' 00:21:25.564 07:26:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.564 07:26:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:26.132 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:26.390 [2024-07-25 07:26:58.702847] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:26.390 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:26.390 "name": "raid_bdev1", 00:21:26.390 "aliases": [ 00:21:26.391 "87c43430-619b-4d57-9125-5b4922028d13" 00:21:26.391 ], 00:21:26.391 "product_name": "Raid Volume", 00:21:26.391 "block_size": 512, 00:21:26.391 "num_blocks": 253952, 00:21:26.391 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:26.391 "assigned_rate_limits": { 00:21:26.391 "rw_ios_per_sec": 0, 00:21:26.391 "rw_mbytes_per_sec": 0, 00:21:26.391 "r_mbytes_per_sec": 0, 00:21:26.391 "w_mbytes_per_sec": 0 00:21:26.391 }, 00:21:26.391 "claimed": false, 00:21:26.391 "zoned": false, 00:21:26.391 "supported_io_types": { 00:21:26.391 "read": true, 00:21:26.391 "write": true, 00:21:26.391 "unmap": true, 00:21:26.391 "flush": true, 00:21:26.391 "reset": true, 00:21:26.391 "nvme_admin": false, 00:21:26.391 "nvme_io": false, 00:21:26.391 "nvme_io_md": false, 00:21:26.391 "write_zeroes": true, 00:21:26.391 "zcopy": false, 00:21:26.391 "get_zone_info": false, 00:21:26.391 "zone_management": false, 00:21:26.391 "zone_append": false, 
00:21:26.391 "compare": false, 00:21:26.391 "compare_and_write": false, 00:21:26.391 "abort": false, 00:21:26.391 "seek_hole": false, 00:21:26.391 "seek_data": false, 00:21:26.391 "copy": false, 00:21:26.391 "nvme_iov_md": false 00:21:26.391 }, 00:21:26.391 "memory_domains": [ 00:21:26.391 { 00:21:26.391 "dma_device_id": "system", 00:21:26.391 "dma_device_type": 1 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.391 "dma_device_type": 2 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "dma_device_id": "system", 00:21:26.391 "dma_device_type": 1 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.391 "dma_device_type": 2 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "dma_device_id": "system", 00:21:26.391 "dma_device_type": 1 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.391 "dma_device_type": 2 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "dma_device_id": "system", 00:21:26.391 "dma_device_type": 1 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.391 "dma_device_type": 2 00:21:26.391 } 00:21:26.391 ], 00:21:26.391 "driver_specific": { 00:21:26.391 "raid": { 00:21:26.391 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:26.391 "strip_size_kb": 64, 00:21:26.391 "state": "online", 00:21:26.391 "raid_level": "concat", 00:21:26.391 "superblock": true, 00:21:26.391 "num_base_bdevs": 4, 00:21:26.391 "num_base_bdevs_discovered": 4, 00:21:26.391 "num_base_bdevs_operational": 4, 00:21:26.391 "base_bdevs_list": [ 00:21:26.391 { 00:21:26.391 "name": "pt1", 00:21:26.391 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:26.391 "is_configured": true, 00:21:26.391 "data_offset": 2048, 00:21:26.391 "data_size": 63488 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "name": "pt2", 00:21:26.391 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:26.391 "is_configured": true, 00:21:26.391 "data_offset": 2048, 00:21:26.391 "data_size": 63488 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "name": "pt3", 00:21:26.391 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:26.391 "is_configured": true, 00:21:26.391 "data_offset": 2048, 00:21:26.391 "data_size": 63488 00:21:26.391 }, 00:21:26.391 { 00:21:26.391 "name": "pt4", 00:21:26.391 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:26.391 "is_configured": true, 00:21:26.391 "data_offset": 2048, 00:21:26.391 "data_size": 63488 00:21:26.391 } 00:21:26.391 ] 00:21:26.391 } 00:21:26.391 } 00:21:26.391 }' 00:21:26.391 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:26.391 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:26.391 pt2 00:21:26.391 pt3 00:21:26.391 pt4' 00:21:26.391 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.391 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:26.391 07:26:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:26.649 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:26.649 "name": "pt1", 00:21:26.649 "aliases": [ 00:21:26.649 "00000000-0000-0000-0000-000000000001" 00:21:26.649 ], 00:21:26.649 "product_name": "passthru", 00:21:26.649 
"block_size": 512, 00:21:26.649 "num_blocks": 65536, 00:21:26.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:26.649 "assigned_rate_limits": { 00:21:26.649 "rw_ios_per_sec": 0, 00:21:26.649 "rw_mbytes_per_sec": 0, 00:21:26.649 "r_mbytes_per_sec": 0, 00:21:26.649 "w_mbytes_per_sec": 0 00:21:26.649 }, 00:21:26.649 "claimed": true, 00:21:26.649 "claim_type": "exclusive_write", 00:21:26.649 "zoned": false, 00:21:26.649 "supported_io_types": { 00:21:26.649 "read": true, 00:21:26.649 "write": true, 00:21:26.649 "unmap": true, 00:21:26.649 "flush": true, 00:21:26.649 "reset": true, 00:21:26.649 "nvme_admin": false, 00:21:26.649 "nvme_io": false, 00:21:26.649 "nvme_io_md": false, 00:21:26.649 "write_zeroes": true, 00:21:26.649 "zcopy": true, 00:21:26.649 "get_zone_info": false, 00:21:26.649 "zone_management": false, 00:21:26.649 "zone_append": false, 00:21:26.649 "compare": false, 00:21:26.649 "compare_and_write": false, 00:21:26.649 "abort": true, 00:21:26.649 "seek_hole": false, 00:21:26.649 "seek_data": false, 00:21:26.649 "copy": true, 00:21:26.650 "nvme_iov_md": false 00:21:26.650 }, 00:21:26.650 "memory_domains": [ 00:21:26.650 { 00:21:26.650 "dma_device_id": "system", 00:21:26.650 "dma_device_type": 1 00:21:26.650 }, 00:21:26.650 { 00:21:26.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.650 "dma_device_type": 2 00:21:26.650 } 00:21:26.650 ], 00:21:26.650 "driver_specific": { 00:21:26.650 "passthru": { 00:21:26.650 "name": "pt1", 00:21:26.650 "base_bdev_name": "malloc1" 00:21:26.650 } 00:21:26.650 } 00:21:26.650 }' 00:21:26.650 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.650 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.650 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:26.650 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.650 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.650 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.650 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:26.908 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.168 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.168 "name": "pt2", 00:21:27.168 "aliases": [ 00:21:27.168 "00000000-0000-0000-0000-000000000002" 00:21:27.168 ], 00:21:27.168 "product_name": "passthru", 00:21:27.168 "block_size": 512, 00:21:27.168 "num_blocks": 65536, 00:21:27.168 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:27.168 
"assigned_rate_limits": { 00:21:27.168 "rw_ios_per_sec": 0, 00:21:27.168 "rw_mbytes_per_sec": 0, 00:21:27.168 "r_mbytes_per_sec": 0, 00:21:27.168 "w_mbytes_per_sec": 0 00:21:27.168 }, 00:21:27.168 "claimed": true, 00:21:27.168 "claim_type": "exclusive_write", 00:21:27.168 "zoned": false, 00:21:27.168 "supported_io_types": { 00:21:27.168 "read": true, 00:21:27.168 "write": true, 00:21:27.168 "unmap": true, 00:21:27.168 "flush": true, 00:21:27.168 "reset": true, 00:21:27.168 "nvme_admin": false, 00:21:27.168 "nvme_io": false, 00:21:27.168 "nvme_io_md": false, 00:21:27.168 "write_zeroes": true, 00:21:27.168 "zcopy": true, 00:21:27.168 "get_zone_info": false, 00:21:27.168 "zone_management": false, 00:21:27.168 "zone_append": false, 00:21:27.168 "compare": false, 00:21:27.168 "compare_and_write": false, 00:21:27.168 "abort": true, 00:21:27.168 "seek_hole": false, 00:21:27.168 "seek_data": false, 00:21:27.168 "copy": true, 00:21:27.168 "nvme_iov_md": false 00:21:27.168 }, 00:21:27.168 "memory_domains": [ 00:21:27.168 { 00:21:27.168 "dma_device_id": "system", 00:21:27.168 "dma_device_type": 1 00:21:27.168 }, 00:21:27.168 { 00:21:27.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.168 "dma_device_type": 2 00:21:27.168 } 00:21:27.168 ], 00:21:27.168 "driver_specific": { 00:21:27.168 "passthru": { 00:21:27.168 "name": "pt2", 00:21:27.168 "base_bdev_name": "malloc2" 00:21:27.168 } 00:21:27.168 } 00:21:27.168 }' 00:21:27.168 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.168 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.168 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.168 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:27.427 07:26:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.686 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.686 "name": "pt3", 00:21:27.686 "aliases": [ 00:21:27.686 "00000000-0000-0000-0000-000000000003" 00:21:27.686 ], 00:21:27.686 "product_name": "passthru", 00:21:27.686 "block_size": 512, 00:21:27.686 "num_blocks": 65536, 00:21:27.686 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:27.686 "assigned_rate_limits": { 00:21:27.686 "rw_ios_per_sec": 0, 00:21:27.686 "rw_mbytes_per_sec": 0, 00:21:27.686 "r_mbytes_per_sec": 0, 
00:21:27.686 "w_mbytes_per_sec": 0 00:21:27.686 }, 00:21:27.686 "claimed": true, 00:21:27.686 "claim_type": "exclusive_write", 00:21:27.686 "zoned": false, 00:21:27.686 "supported_io_types": { 00:21:27.686 "read": true, 00:21:27.686 "write": true, 00:21:27.686 "unmap": true, 00:21:27.686 "flush": true, 00:21:27.686 "reset": true, 00:21:27.686 "nvme_admin": false, 00:21:27.686 "nvme_io": false, 00:21:27.686 "nvme_io_md": false, 00:21:27.686 "write_zeroes": true, 00:21:27.686 "zcopy": true, 00:21:27.686 "get_zone_info": false, 00:21:27.686 "zone_management": false, 00:21:27.686 "zone_append": false, 00:21:27.686 "compare": false, 00:21:27.686 "compare_and_write": false, 00:21:27.686 "abort": true, 00:21:27.686 "seek_hole": false, 00:21:27.686 "seek_data": false, 00:21:27.686 "copy": true, 00:21:27.686 "nvme_iov_md": false 00:21:27.686 }, 00:21:27.686 "memory_domains": [ 00:21:27.686 { 00:21:27.686 "dma_device_id": "system", 00:21:27.686 "dma_device_type": 1 00:21:27.686 }, 00:21:27.686 { 00:21:27.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.686 "dma_device_type": 2 00:21:27.686 } 00:21:27.686 ], 00:21:27.686 "driver_specific": { 00:21:27.686 "passthru": { 00:21:27.686 "name": "pt3", 00:21:27.686 "base_bdev_name": "malloc3" 00:21:27.686 } 00:21:27.686 } 00:21:27.686 }' 00:21:27.686 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.686 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:27.945 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:28.204 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:28.204 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:28.204 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:28.204 "name": "pt4", 00:21:28.204 "aliases": [ 00:21:28.204 "00000000-0000-0000-0000-000000000004" 00:21:28.204 ], 00:21:28.204 "product_name": "passthru", 00:21:28.204 "block_size": 512, 00:21:28.204 "num_blocks": 65536, 00:21:28.204 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:28.204 "assigned_rate_limits": { 00:21:28.204 "rw_ios_per_sec": 0, 00:21:28.204 "rw_mbytes_per_sec": 0, 00:21:28.204 "r_mbytes_per_sec": 0, 00:21:28.204 "w_mbytes_per_sec": 0 00:21:28.204 }, 00:21:28.204 "claimed": true, 00:21:28.204 "claim_type": "exclusive_write", 
00:21:28.204 "zoned": false, 00:21:28.204 "supported_io_types": { 00:21:28.204 "read": true, 00:21:28.204 "write": true, 00:21:28.204 "unmap": true, 00:21:28.204 "flush": true, 00:21:28.204 "reset": true, 00:21:28.204 "nvme_admin": false, 00:21:28.204 "nvme_io": false, 00:21:28.204 "nvme_io_md": false, 00:21:28.204 "write_zeroes": true, 00:21:28.204 "zcopy": true, 00:21:28.204 "get_zone_info": false, 00:21:28.204 "zone_management": false, 00:21:28.204 "zone_append": false, 00:21:28.204 "compare": false, 00:21:28.204 "compare_and_write": false, 00:21:28.204 "abort": true, 00:21:28.204 "seek_hole": false, 00:21:28.204 "seek_data": false, 00:21:28.204 "copy": true, 00:21:28.204 "nvme_iov_md": false 00:21:28.204 }, 00:21:28.204 "memory_domains": [ 00:21:28.204 { 00:21:28.204 "dma_device_id": "system", 00:21:28.204 "dma_device_type": 1 00:21:28.204 }, 00:21:28.204 { 00:21:28.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:28.204 "dma_device_type": 2 00:21:28.204 } 00:21:28.204 ], 00:21:28.204 "driver_specific": { 00:21:28.204 "passthru": { 00:21:28.204 "name": "pt4", 00:21:28.204 "base_bdev_name": "malloc4" 00:21:28.204 } 00:21:28.204 } 00:21:28.204 }' 00:21:28.204 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:28.463 07:27:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.722 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.722 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:28.722 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:28.722 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:21:28.722 [2024-07-25 07:27:01.205434] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:28.722 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=87c43430-619b-4d57-9125-5b4922028d13 00:21:28.722 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 87c43430-619b-4d57-9125-5b4922028d13 ']' 00:21:28.722 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:28.981 [2024-07-25 07:27:01.437749] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:28.981 [2024-07-25 07:27:01.437767] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:28.981 [2024-07-25 07:27:01.437813] 
bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:28.981 [2024-07-25 07:27:01.437870] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:28.981 [2024-07-25 07:27:01.437880] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e12720 name raid_bdev1, state offline 00:21:28.981 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:21:28.981 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.240 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:21:29.240 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:21:29.240 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:29.240 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:29.499 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:29.499 07:27:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:30.067 07:27:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:30.067 07:27:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:30.325 07:27:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:30.325 07:27:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:30.584 07:27:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:30.584 07:27:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:30.584 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:30.843 [2024-07-25 07:27:03.302577] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:30.843 [2024-07-25 07:27:03.303842] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:30.843 [2024-07-25 07:27:03.303884] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:30.843 [2024-07-25 07:27:03.303916] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:30.843 [2024-07-25 07:27:03.303956] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:30.843 [2024-07-25 07:27:03.303994] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:30.843 [2024-07-25 07:27:03.304015] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:30.843 [2024-07-25 07:27:03.304036] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:30.843 [2024-07-25 07:27:03.304052] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:30.843 [2024-07-25 07:27:03.304062] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e15b00 name raid_bdev1, state configuring 00:21:30.843 request: 00:21:30.843 { 00:21:30.843 "name": "raid_bdev1", 00:21:30.843 "raid_level": "concat", 00:21:30.843 "base_bdevs": [ 00:21:30.843 "malloc1", 00:21:30.843 "malloc2", 00:21:30.843 "malloc3", 00:21:30.843 "malloc4" 00:21:30.843 ], 00:21:30.843 "strip_size_kb": 64, 00:21:30.843 "superblock": false, 00:21:30.843 "method": "bdev_raid_create", 00:21:30.843 "req_id": 1 00:21:30.843 } 00:21:30.843 Got JSON-RPC error response 00:21:30.843 response: 00:21:30.843 { 00:21:30.843 "code": -17, 00:21:30.843 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:30.843 } 00:21:30.843 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:21:30.843 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:21:30.844 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:21:30.844 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:21:30.844 07:27:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.844 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:21:31.102 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:21:31.102 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:21:31.102 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:31.367 [2024-07-25 07:27:03.743677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:31.367 [2024-07-25 07:27:03.743721] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.367 [2024-07-25 07:27:03.743741] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0eea0 00:21:31.367 [2024-07-25 07:27:03.743753] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.367 [2024-07-25 07:27:03.745235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.367 [2024-07-25 07:27:03.745262] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:31.367 [2024-07-25 07:27:03.745331] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:31.367 [2024-07-25 07:27:03.745357] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:31.367 pt1 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.367 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.625 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.625 "name": "raid_bdev1", 00:21:31.625 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:31.625 "strip_size_kb": 64, 00:21:31.625 "state": "configuring", 00:21:31.625 "raid_level": "concat", 00:21:31.625 "superblock": true, 00:21:31.625 "num_base_bdevs": 4, 00:21:31.625 "num_base_bdevs_discovered": 1, 00:21:31.625 
"num_base_bdevs_operational": 4, 00:21:31.625 "base_bdevs_list": [ 00:21:31.625 { 00:21:31.625 "name": "pt1", 00:21:31.625 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:31.625 "is_configured": true, 00:21:31.625 "data_offset": 2048, 00:21:31.625 "data_size": 63488 00:21:31.625 }, 00:21:31.625 { 00:21:31.625 "name": null, 00:21:31.625 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:31.625 "is_configured": false, 00:21:31.625 "data_offset": 2048, 00:21:31.625 "data_size": 63488 00:21:31.625 }, 00:21:31.625 { 00:21:31.625 "name": null, 00:21:31.625 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:31.625 "is_configured": false, 00:21:31.625 "data_offset": 2048, 00:21:31.625 "data_size": 63488 00:21:31.625 }, 00:21:31.625 { 00:21:31.625 "name": null, 00:21:31.625 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:31.625 "is_configured": false, 00:21:31.625 "data_offset": 2048, 00:21:31.625 "data_size": 63488 00:21:31.625 } 00:21:31.625 ] 00:21:31.625 }' 00:21:31.626 07:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.626 07:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.193 07:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:21:32.193 07:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:32.451 [2024-07-25 07:27:04.750335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:32.451 [2024-07-25 07:27:04.750387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:32.451 [2024-07-25 07:27:04.750405] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0e2b0 00:21:32.451 [2024-07-25 07:27:04.750416] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:32.451 [2024-07-25 07:27:04.750739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:32.451 [2024-07-25 07:27:04.750756] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:32.451 [2024-07-25 07:27:04.750814] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:32.451 [2024-07-25 07:27:04.750831] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:32.451 pt2 00:21:32.451 07:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:32.451 [2024-07-25 07:27:04.978939] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:32.709 07:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:32.709 07:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.709 07:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:32.709 07:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:32.709 07:27:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.709 "name": "raid_bdev1", 00:21:32.709 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:32.709 "strip_size_kb": 64, 00:21:32.709 "state": "configuring", 00:21:32.709 "raid_level": "concat", 00:21:32.709 "superblock": true, 00:21:32.709 "num_base_bdevs": 4, 00:21:32.709 "num_base_bdevs_discovered": 1, 00:21:32.709 "num_base_bdevs_operational": 4, 00:21:32.709 "base_bdevs_list": [ 00:21:32.709 { 00:21:32.709 "name": "pt1", 00:21:32.709 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:32.709 "is_configured": true, 00:21:32.709 "data_offset": 2048, 00:21:32.709 "data_size": 63488 00:21:32.709 }, 00:21:32.709 { 00:21:32.709 "name": null, 00:21:32.709 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:32.709 "is_configured": false, 00:21:32.709 "data_offset": 2048, 00:21:32.709 "data_size": 63488 00:21:32.709 }, 00:21:32.709 { 00:21:32.709 "name": null, 00:21:32.709 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:32.709 "is_configured": false, 00:21:32.709 "data_offset": 2048, 00:21:32.709 "data_size": 63488 00:21:32.709 }, 00:21:32.709 { 00:21:32.709 "name": null, 00:21:32.709 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:32.709 "is_configured": false, 00:21:32.709 "data_offset": 2048, 00:21:32.709 "data_size": 63488 00:21:32.709 } 00:21:32.709 ] 00:21:32.709 }' 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.709 07:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:33.645 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:21:33.645 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:33.645 07:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:33.645 [2024-07-25 07:27:06.009668] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:33.645 [2024-07-25 07:27:06.009717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:33.645 [2024-07-25 07:27:06.009734] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c644b0 00:21:33.645 [2024-07-25 07:27:06.009746] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:33.645 [2024-07-25 07:27:06.010063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:33.645 [2024-07-25 07:27:06.010079] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:33.645 [2024-07-25 07:27:06.010148] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:33.645 [2024-07-25 07:27:06.010167] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:33.645 pt2 00:21:33.645 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:33.645 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:33.645 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:33.903 [2024-07-25 07:27:06.234263] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:33.903 [2024-07-25 07:27:06.234299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:33.903 [2024-07-25 07:27:06.234314] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e12380 00:21:33.903 [2024-07-25 07:27:06.234325] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:33.903 [2024-07-25 07:27:06.234611] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:33.903 [2024-07-25 07:27:06.234627] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:33.903 [2024-07-25 07:27:06.234678] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:33.903 [2024-07-25 07:27:06.234696] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:33.903 pt3 00:21:33.903 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:33.903 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:33.903 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:34.162 [2024-07-25 07:27:06.446823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:34.162 [2024-07-25 07:27:06.446853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:34.162 [2024-07-25 07:27:06.446869] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e13f00 00:21:34.162 [2024-07-25 07:27:06.446879] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.162 [2024-07-25 07:27:06.447149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.162 [2024-07-25 07:27:06.447165] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:34.162 [2024-07-25 07:27:06.447213] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:34.162 [2024-07-25 07:27:06.447229] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:34.162 [2024-07-25 07:27:06.447335] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e0e740 00:21:34.162 [2024-07-25 07:27:06.447345] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:34.162 [2024-07-25 07:27:06.447498] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c63710 00:21:34.162 [2024-07-25 07:27:06.447613] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x1e0e740 00:21:34.162 [2024-07-25 07:27:06.447622] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e0e740 00:21:34.162 [2024-07-25 07:27:06.447708] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:34.162 pt4 00:21:34.162 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:34.162 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:34.162 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:34.162 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:34.162 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:34.163 "name": "raid_bdev1", 00:21:34.163 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:34.163 "strip_size_kb": 64, 00:21:34.163 "state": "online", 00:21:34.163 "raid_level": "concat", 00:21:34.163 "superblock": true, 00:21:34.163 "num_base_bdevs": 4, 00:21:34.163 "num_base_bdevs_discovered": 4, 00:21:34.163 "num_base_bdevs_operational": 4, 00:21:34.163 "base_bdevs_list": [ 00:21:34.163 { 00:21:34.163 "name": "pt1", 00:21:34.163 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:34.163 "is_configured": true, 00:21:34.163 "data_offset": 2048, 00:21:34.163 "data_size": 63488 00:21:34.163 }, 00:21:34.163 { 00:21:34.163 "name": "pt2", 00:21:34.163 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:34.163 "is_configured": true, 00:21:34.163 "data_offset": 2048, 00:21:34.163 "data_size": 63488 00:21:34.163 }, 00:21:34.163 { 00:21:34.163 "name": "pt3", 00:21:34.163 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:34.163 "is_configured": true, 00:21:34.163 "data_offset": 2048, 00:21:34.163 "data_size": 63488 00:21:34.163 }, 00:21:34.163 { 00:21:34.163 "name": "pt4", 00:21:34.163 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:34.163 "is_configured": true, 00:21:34.163 "data_offset": 2048, 00:21:34.163 "data_size": 63488 00:21:34.163 } 00:21:34.163 ] 00:21:34.163 }' 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:34.163 07:27:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:34.733 07:27:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:21:34.733 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:34.733 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:34.733 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:34.733 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:34.733 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:34.733 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:34.733 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:34.992 [2024-07-25 07:27:07.473840] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:34.992 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:34.992 "name": "raid_bdev1", 00:21:34.992 "aliases": [ 00:21:34.992 "87c43430-619b-4d57-9125-5b4922028d13" 00:21:34.992 ], 00:21:34.992 "product_name": "Raid Volume", 00:21:34.992 "block_size": 512, 00:21:34.992 "num_blocks": 253952, 00:21:34.992 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:34.992 "assigned_rate_limits": { 00:21:34.992 "rw_ios_per_sec": 0, 00:21:34.992 "rw_mbytes_per_sec": 0, 00:21:34.992 "r_mbytes_per_sec": 0, 00:21:34.992 "w_mbytes_per_sec": 0 00:21:34.992 }, 00:21:34.992 "claimed": false, 00:21:34.992 "zoned": false, 00:21:34.992 "supported_io_types": { 00:21:34.992 "read": true, 00:21:34.992 "write": true, 00:21:34.992 "unmap": true, 00:21:34.992 "flush": true, 00:21:34.992 "reset": true, 00:21:34.992 "nvme_admin": false, 00:21:34.992 "nvme_io": false, 00:21:34.992 "nvme_io_md": false, 00:21:34.992 "write_zeroes": true, 00:21:34.992 "zcopy": false, 00:21:34.992 "get_zone_info": false, 00:21:34.992 "zone_management": false, 00:21:34.992 "zone_append": false, 00:21:34.992 "compare": false, 00:21:34.992 "compare_and_write": false, 00:21:34.992 "abort": false, 00:21:34.992 "seek_hole": false, 00:21:34.992 "seek_data": false, 00:21:34.992 "copy": false, 00:21:34.992 "nvme_iov_md": false 00:21:34.992 }, 00:21:34.992 "memory_domains": [ 00:21:34.992 { 00:21:34.992 "dma_device_id": "system", 00:21:34.992 "dma_device_type": 1 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.992 "dma_device_type": 2 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "dma_device_id": "system", 00:21:34.992 "dma_device_type": 1 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.992 "dma_device_type": 2 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "dma_device_id": "system", 00:21:34.992 "dma_device_type": 1 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.992 "dma_device_type": 2 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "dma_device_id": "system", 00:21:34.992 "dma_device_type": 1 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.992 "dma_device_type": 2 00:21:34.992 } 00:21:34.992 ], 00:21:34.992 "driver_specific": { 00:21:34.992 "raid": { 00:21:34.992 "uuid": "87c43430-619b-4d57-9125-5b4922028d13", 00:21:34.992 "strip_size_kb": 64, 00:21:34.992 "state": "online", 00:21:34.992 "raid_level": 
"concat", 00:21:34.992 "superblock": true, 00:21:34.992 "num_base_bdevs": 4, 00:21:34.992 "num_base_bdevs_discovered": 4, 00:21:34.992 "num_base_bdevs_operational": 4, 00:21:34.992 "base_bdevs_list": [ 00:21:34.992 { 00:21:34.992 "name": "pt1", 00:21:34.992 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:34.992 "is_configured": true, 00:21:34.992 "data_offset": 2048, 00:21:34.992 "data_size": 63488 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "name": "pt2", 00:21:34.992 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:34.992 "is_configured": true, 00:21:34.992 "data_offset": 2048, 00:21:34.992 "data_size": 63488 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "name": "pt3", 00:21:34.992 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:34.992 "is_configured": true, 00:21:34.992 "data_offset": 2048, 00:21:34.992 "data_size": 63488 00:21:34.992 }, 00:21:34.992 { 00:21:34.992 "name": "pt4", 00:21:34.992 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:34.992 "is_configured": true, 00:21:34.992 "data_offset": 2048, 00:21:34.992 "data_size": 63488 00:21:34.992 } 00:21:34.992 ] 00:21:34.992 } 00:21:34.992 } 00:21:34.992 }' 00:21:34.992 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:35.250 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:35.250 pt2 00:21:35.250 pt3 00:21:35.250 pt4' 00:21:35.250 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:35.250 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:35.250 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.250 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.250 "name": "pt1", 00:21:35.250 "aliases": [ 00:21:35.250 "00000000-0000-0000-0000-000000000001" 00:21:35.250 ], 00:21:35.250 "product_name": "passthru", 00:21:35.250 "block_size": 512, 00:21:35.250 "num_blocks": 65536, 00:21:35.250 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:35.250 "assigned_rate_limits": { 00:21:35.250 "rw_ios_per_sec": 0, 00:21:35.250 "rw_mbytes_per_sec": 0, 00:21:35.250 "r_mbytes_per_sec": 0, 00:21:35.250 "w_mbytes_per_sec": 0 00:21:35.250 }, 00:21:35.250 "claimed": true, 00:21:35.250 "claim_type": "exclusive_write", 00:21:35.250 "zoned": false, 00:21:35.250 "supported_io_types": { 00:21:35.250 "read": true, 00:21:35.250 "write": true, 00:21:35.250 "unmap": true, 00:21:35.250 "flush": true, 00:21:35.250 "reset": true, 00:21:35.250 "nvme_admin": false, 00:21:35.250 "nvme_io": false, 00:21:35.250 "nvme_io_md": false, 00:21:35.250 "write_zeroes": true, 00:21:35.250 "zcopy": true, 00:21:35.250 "get_zone_info": false, 00:21:35.250 "zone_management": false, 00:21:35.250 "zone_append": false, 00:21:35.250 "compare": false, 00:21:35.250 "compare_and_write": false, 00:21:35.250 "abort": true, 00:21:35.250 "seek_hole": false, 00:21:35.250 "seek_data": false, 00:21:35.250 "copy": true, 00:21:35.250 "nvme_iov_md": false 00:21:35.250 }, 00:21:35.250 "memory_domains": [ 00:21:35.250 { 00:21:35.250 "dma_device_id": "system", 00:21:35.250 "dma_device_type": 1 00:21:35.250 }, 00:21:35.250 { 00:21:35.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.250 "dma_device_type": 2 00:21:35.250 } 00:21:35.250 ], 00:21:35.250 
"driver_specific": { 00:21:35.250 "passthru": { 00:21:35.250 "name": "pt1", 00:21:35.250 "base_bdev_name": "malloc1" 00:21:35.250 } 00:21:35.250 } 00:21:35.250 }' 00:21:35.250 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.509 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.509 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.509 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.509 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.509 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.509 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.509 07:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.509 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:35.509 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.767 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.767 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:35.767 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:35.767 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:35.767 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:36.027 "name": "pt2", 00:21:36.027 "aliases": [ 00:21:36.027 "00000000-0000-0000-0000-000000000002" 00:21:36.027 ], 00:21:36.027 "product_name": "passthru", 00:21:36.027 "block_size": 512, 00:21:36.027 "num_blocks": 65536, 00:21:36.027 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:36.027 "assigned_rate_limits": { 00:21:36.027 "rw_ios_per_sec": 0, 00:21:36.027 "rw_mbytes_per_sec": 0, 00:21:36.027 "r_mbytes_per_sec": 0, 00:21:36.027 "w_mbytes_per_sec": 0 00:21:36.027 }, 00:21:36.027 "claimed": true, 00:21:36.027 "claim_type": "exclusive_write", 00:21:36.027 "zoned": false, 00:21:36.027 "supported_io_types": { 00:21:36.027 "read": true, 00:21:36.027 "write": true, 00:21:36.027 "unmap": true, 00:21:36.027 "flush": true, 00:21:36.027 "reset": true, 00:21:36.027 "nvme_admin": false, 00:21:36.027 "nvme_io": false, 00:21:36.027 "nvme_io_md": false, 00:21:36.027 "write_zeroes": true, 00:21:36.027 "zcopy": true, 00:21:36.027 "get_zone_info": false, 00:21:36.027 "zone_management": false, 00:21:36.027 "zone_append": false, 00:21:36.027 "compare": false, 00:21:36.027 "compare_and_write": false, 00:21:36.027 "abort": true, 00:21:36.027 "seek_hole": false, 00:21:36.027 "seek_data": false, 00:21:36.027 "copy": true, 00:21:36.027 "nvme_iov_md": false 00:21:36.027 }, 00:21:36.027 "memory_domains": [ 00:21:36.027 { 00:21:36.027 "dma_device_id": "system", 00:21:36.027 "dma_device_type": 1 00:21:36.027 }, 00:21:36.027 { 00:21:36.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.027 "dma_device_type": 2 00:21:36.027 } 00:21:36.027 ], 00:21:36.027 "driver_specific": { 00:21:36.027 "passthru": { 00:21:36.027 "name": "pt2", 00:21:36.027 "base_bdev_name": "malloc2" 00:21:36.027 } 
00:21:36.027 } 00:21:36.027 }' 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:36.027 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:36.286 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:36.545 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:36.545 "name": "pt3", 00:21:36.545 "aliases": [ 00:21:36.545 "00000000-0000-0000-0000-000000000003" 00:21:36.545 ], 00:21:36.545 "product_name": "passthru", 00:21:36.545 "block_size": 512, 00:21:36.545 "num_blocks": 65536, 00:21:36.545 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:36.545 "assigned_rate_limits": { 00:21:36.545 "rw_ios_per_sec": 0, 00:21:36.545 "rw_mbytes_per_sec": 0, 00:21:36.545 "r_mbytes_per_sec": 0, 00:21:36.545 "w_mbytes_per_sec": 0 00:21:36.545 }, 00:21:36.545 "claimed": true, 00:21:36.545 "claim_type": "exclusive_write", 00:21:36.545 "zoned": false, 00:21:36.545 "supported_io_types": { 00:21:36.545 "read": true, 00:21:36.545 "write": true, 00:21:36.545 "unmap": true, 00:21:36.545 "flush": true, 00:21:36.545 "reset": true, 00:21:36.545 "nvme_admin": false, 00:21:36.545 "nvme_io": false, 00:21:36.545 "nvme_io_md": false, 00:21:36.545 "write_zeroes": true, 00:21:36.545 "zcopy": true, 00:21:36.545 "get_zone_info": false, 00:21:36.545 "zone_management": false, 00:21:36.545 "zone_append": false, 00:21:36.545 "compare": false, 00:21:36.545 "compare_and_write": false, 00:21:36.545 "abort": true, 00:21:36.545 "seek_hole": false, 00:21:36.545 "seek_data": false, 00:21:36.545 "copy": true, 00:21:36.545 "nvme_iov_md": false 00:21:36.545 }, 00:21:36.545 "memory_domains": [ 00:21:36.545 { 00:21:36.545 "dma_device_id": "system", 00:21:36.545 "dma_device_type": 1 00:21:36.545 }, 00:21:36.545 { 00:21:36.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.545 "dma_device_type": 2 00:21:36.545 } 00:21:36.545 ], 00:21:36.545 "driver_specific": { 00:21:36.545 "passthru": { 00:21:36.545 "name": "pt3", 00:21:36.545 "base_bdev_name": "malloc3" 00:21:36.545 } 00:21:36.545 } 00:21:36.545 }' 00:21:36.545 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:21:36.545 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:36.545 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:36.545 07:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.545 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:36.804 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:37.063 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:37.063 "name": "pt4", 00:21:37.063 "aliases": [ 00:21:37.063 "00000000-0000-0000-0000-000000000004" 00:21:37.063 ], 00:21:37.063 "product_name": "passthru", 00:21:37.063 "block_size": 512, 00:21:37.063 "num_blocks": 65536, 00:21:37.063 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:37.063 "assigned_rate_limits": { 00:21:37.063 "rw_ios_per_sec": 0, 00:21:37.063 "rw_mbytes_per_sec": 0, 00:21:37.063 "r_mbytes_per_sec": 0, 00:21:37.063 "w_mbytes_per_sec": 0 00:21:37.063 }, 00:21:37.063 "claimed": true, 00:21:37.063 "claim_type": "exclusive_write", 00:21:37.063 "zoned": false, 00:21:37.063 "supported_io_types": { 00:21:37.063 "read": true, 00:21:37.063 "write": true, 00:21:37.063 "unmap": true, 00:21:37.063 "flush": true, 00:21:37.063 "reset": true, 00:21:37.063 "nvme_admin": false, 00:21:37.063 "nvme_io": false, 00:21:37.063 "nvme_io_md": false, 00:21:37.063 "write_zeroes": true, 00:21:37.063 "zcopy": true, 00:21:37.063 "get_zone_info": false, 00:21:37.063 "zone_management": false, 00:21:37.063 "zone_append": false, 00:21:37.063 "compare": false, 00:21:37.063 "compare_and_write": false, 00:21:37.063 "abort": true, 00:21:37.063 "seek_hole": false, 00:21:37.063 "seek_data": false, 00:21:37.063 "copy": true, 00:21:37.063 "nvme_iov_md": false 00:21:37.063 }, 00:21:37.063 "memory_domains": [ 00:21:37.063 { 00:21:37.063 "dma_device_id": "system", 00:21:37.063 "dma_device_type": 1 00:21:37.063 }, 00:21:37.063 { 00:21:37.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.063 "dma_device_type": 2 00:21:37.063 } 00:21:37.063 ], 00:21:37.063 "driver_specific": { 00:21:37.063 "passthru": { 00:21:37.063 "name": "pt4", 00:21:37.063 "base_bdev_name": "malloc4" 00:21:37.063 } 00:21:37.063 } 00:21:37.063 }' 00:21:37.063 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:37.063 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:37.063 07:27:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:37.063 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:37.322 07:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:21:37.579 [2024-07-25 07:27:10.044607] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 87c43430-619b-4d57-9125-5b4922028d13 '!=' 87c43430-619b-4d57-9125-5b4922028d13 ']' 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1689645 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1689645 ']' 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1689645 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:37.579 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1689645 00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1689645' 00:21:37.838 killing process with pid 1689645 00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1689645 00:21:37.838 [2024-07-25 07:27:10.122872] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:37.838 [2024-07-25 07:27:10.122936] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:37.838 [2024-07-25 07:27:10.122996] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:37.838 [2024-07-25 07:27:10.123007] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e0e740 name raid_bdev1, state offline 
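For readers following the trace, the per-base-bdev verification above boils down to a few jq filters over bdev_get_bdevs output: every configured base bdev of raid_bdev1 is expected to be a 512-byte-block passthru with no metadata, no interleave and no DIF. The loop below is a minimal shell sketch of that check, not the exact bdev_raid.sh helper; the rpc.py path and socket are copied from the trace, and the variable names are illustrative only.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # configured base bdev names of raid_bdev1, e.g. "pt1 pt2 pt3 pt4"
    names=$($rpc -s $sock bdev_get_bdevs -b raid_bdev1 | jq -r \
        '.[] | .driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

    for name in $names; do
        info=$($rpc -s $sock bdev_get_bdevs -b "$name" | jq '.[]')
        [[ $(jq .block_size <<< "$info") == 512 ]]     # passthru over a 512-byte malloc bdev
        [[ $(jq .md_size <<< "$info") == null ]]       # no separate metadata region
        [[ $(jq .md_interleave <<< "$info") == null ]]
        [[ $(jq .dif_type <<< "$info") == null ]]
    done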
00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1689645 00:21:37.838 [2024-07-25 07:27:10.155869] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:21:37.838 00:21:37.838 real 0m15.709s 00:21:37.838 user 0m28.388s 00:21:37.838 sys 0m2.784s 00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:37.838 07:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:37.838 ************************************ 00:21:37.838 END TEST raid_superblock_test 00:21:37.838 ************************************ 00:21:38.097 07:27:10 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:21:38.097 07:27:10 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:38.097 07:27:10 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:38.097 07:27:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:38.097 ************************************ 00:21:38.097 START TEST raid_read_error_test 00:21:38.097 ************************************ 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 read 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local 
strip_size 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.nlP0S4hqF7 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1693151 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1693151 /var/tmp/spdk-raid.sock 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1693151 ']' 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:38.097 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:38.097 07:27:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:38.097 [2024-07-25 07:27:10.507405] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
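The read-error test drives I/O through the standalone bdevperf application rather than the regular SPDK test app, so a read failure can be injected while randrw traffic is running against raid_bdev1. The lines below are a condensed sketch of the launch sequence traced above, not the script itself: the binary path, socket and flags are copied from the trace, the log path is the mktemp result, and readiness is shown as a plain polling loop on the RPC socket where the real script uses the waitforlisten helper.

    bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    bdevperf_log=$(mktemp -p /raidtest)          # e.g. /raidtest/tmp.nlP0S4hqF7
    $bdevperf -r $sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f \
        -L bdev_raid > "$bdevperf_log" &
    raid_pid=$!

    # poll until the app answers on its RPC socket before creating the base bdevs
    until $rpc -s $sock rpc_get_methods &> /dev/null; do
        sleep 0.1
    done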
00:21:38.097 [2024-07-25 07:27:10.507462] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1693151 ] 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.0 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.3 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.4 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:38.097 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:38.097 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.097 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:38.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.098 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:38.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.098 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:38.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.098 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:38.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.098 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:38.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.098 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:38.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:38.098 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:38.372 [2024-07-25 07:27:10.639186] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:38.372 [2024-07-25 07:27:10.726071] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:21:38.372 [2024-07-25 07:27:10.789511] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:38.372 [2024-07-25 07:27:10.789546] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:38.950 07:27:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:38.950 07:27:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:21:38.950 07:27:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:38.950 07:27:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:39.208 BaseBdev1_malloc 00:21:39.208 07:27:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:39.466 true 00:21:39.466 07:27:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:39.725 [2024-07-25 07:27:12.059850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:39.725 [2024-07-25 07:27:12.059889] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.725 [2024-07-25 07:27:12.059906] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd93a50 00:21:39.725 [2024-07-25 07:27:12.059918] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.725 [2024-07-25 07:27:12.061420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.725 [2024-07-25 07:27:12.061447] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:39.725 BaseBdev1 00:21:39.725 07:27:12 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:39.725 07:27:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:39.983 BaseBdev2_malloc 00:21:39.983 07:27:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:40.242 true 00:21:40.242 07:27:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:40.242 [2024-07-25 07:27:12.762135] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:40.242 [2024-07-25 07:27:12.762176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:40.242 [2024-07-25 07:27:12.762195] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf3cf40 00:21:40.242 [2024-07-25 07:27:12.762206] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:40.242 [2024-07-25 07:27:12.763579] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:40.242 [2024-07-25 07:27:12.763605] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:40.242 BaseBdev2 00:21:40.501 07:27:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:40.501 07:27:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:40.501 BaseBdev3_malloc 00:21:40.501 07:27:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:40.760 true 00:21:40.760 07:27:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:41.018 [2024-07-25 07:27:13.448178] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:41.018 [2024-07-25 07:27:13.448216] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.018 [2024-07-25 07:27:13.448234] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf40250 00:21:41.018 [2024-07-25 07:27:13.448245] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.018 [2024-07-25 07:27:13.449636] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.018 [2024-07-25 07:27:13.449663] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:41.018 BaseBdev3 00:21:41.018 07:27:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:41.018 07:27:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:41.277 BaseBdev4_malloc 00:21:41.277 07:27:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:41.535 true 00:21:41.535 07:27:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:41.793 [2024-07-25 07:27:14.154388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:41.793 [2024-07-25 07:27:14.154426] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.793 [2024-07-25 07:27:14.154446] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf41b40 00:21:41.794 [2024-07-25 07:27:14.154457] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.794 [2024-07-25 07:27:14.155860] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.794 [2024-07-25 07:27:14.155886] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:41.794 BaseBdev4 00:21:41.794 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:42.052 [2024-07-25 07:27:14.379008] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:42.052 [2024-07-25 07:27:14.380189] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:42.052 [2024-07-25 07:27:14.380252] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:42.052 [2024-07-25 07:27:14.380305] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:42.052 [2024-07-25 07:27:14.380530] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf3e5d0 00:21:42.052 [2024-07-25 07:27:14.380541] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:42.052 [2024-07-25 07:27:14.380719] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd8fc70 00:21:42.052 [2024-07-25 07:27:14.380855] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf3e5d0 00:21:42.052 [2024-07-25 07:27:14.380864] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf3e5d0 00:21:42.052 [2024-07-25 07:27:14.380958] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.052 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.311 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.311 "name": "raid_bdev1", 00:21:42.311 "uuid": "77349023-a9d0-4052-a869-76467fae4abe", 00:21:42.311 "strip_size_kb": 64, 00:21:42.311 "state": "online", 00:21:42.311 "raid_level": "concat", 00:21:42.311 "superblock": true, 00:21:42.311 "num_base_bdevs": 4, 00:21:42.311 "num_base_bdevs_discovered": 4, 00:21:42.311 "num_base_bdevs_operational": 4, 00:21:42.311 "base_bdevs_list": [ 00:21:42.311 { 00:21:42.311 "name": "BaseBdev1", 00:21:42.311 "uuid": "0a37200a-c95d-5ce8-b830-53695e08bcbe", 00:21:42.311 "is_configured": true, 00:21:42.311 "data_offset": 2048, 00:21:42.311 "data_size": 63488 00:21:42.311 }, 00:21:42.311 { 00:21:42.311 "name": "BaseBdev2", 00:21:42.311 "uuid": "e4f3991c-ad96-5872-bd70-7364c9dd4a48", 00:21:42.311 "is_configured": true, 00:21:42.311 "data_offset": 2048, 00:21:42.311 "data_size": 63488 00:21:42.311 }, 00:21:42.311 { 00:21:42.311 "name": "BaseBdev3", 00:21:42.311 "uuid": "93235136-7685-57dd-b9fb-2b5e56a68884", 00:21:42.311 "is_configured": true, 00:21:42.311 "data_offset": 2048, 00:21:42.311 "data_size": 63488 00:21:42.311 }, 00:21:42.311 { 00:21:42.311 "name": "BaseBdev4", 00:21:42.311 "uuid": "9e621845-51f9-57fc-98b1-7c5de6319ec2", 00:21:42.311 "is_configured": true, 00:21:42.311 "data_offset": 2048, 00:21:42.311 "data_size": 63488 00:21:42.311 } 00:21:42.311 ] 00:21:42.311 }' 00:21:42.311 07:27:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.311 07:27:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.878 07:27:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:42.878 07:27:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:21:42.878 [2024-07-25 07:27:15.285781] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe32f40 00:21:43.812 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.071 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:44.329 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.329 "name": "raid_bdev1", 00:21:44.329 "uuid": "77349023-a9d0-4052-a869-76467fae4abe", 00:21:44.329 "strip_size_kb": 64, 00:21:44.329 "state": "online", 00:21:44.329 "raid_level": "concat", 00:21:44.329 "superblock": true, 00:21:44.329 "num_base_bdevs": 4, 00:21:44.329 "num_base_bdevs_discovered": 4, 00:21:44.329 "num_base_bdevs_operational": 4, 00:21:44.329 "base_bdevs_list": [ 00:21:44.329 { 00:21:44.329 "name": "BaseBdev1", 00:21:44.329 "uuid": "0a37200a-c95d-5ce8-b830-53695e08bcbe", 00:21:44.329 "is_configured": true, 00:21:44.329 "data_offset": 2048, 00:21:44.329 "data_size": 63488 00:21:44.329 }, 00:21:44.329 { 00:21:44.329 "name": "BaseBdev2", 00:21:44.329 "uuid": "e4f3991c-ad96-5872-bd70-7364c9dd4a48", 00:21:44.329 "is_configured": true, 00:21:44.329 "data_offset": 2048, 00:21:44.329 "data_size": 63488 00:21:44.329 }, 00:21:44.329 { 00:21:44.329 "name": "BaseBdev3", 00:21:44.329 "uuid": "93235136-7685-57dd-b9fb-2b5e56a68884", 00:21:44.329 "is_configured": true, 00:21:44.329 "data_offset": 2048, 00:21:44.329 "data_size": 63488 00:21:44.329 }, 00:21:44.329 { 00:21:44.329 "name": "BaseBdev4", 00:21:44.329 "uuid": "9e621845-51f9-57fc-98b1-7c5de6319ec2", 00:21:44.329 "is_configured": true, 00:21:44.329 "data_offset": 2048, 00:21:44.329 "data_size": 63488 00:21:44.329 } 00:21:44.329 ] 00:21:44.329 }' 00:21:44.329 07:27:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.329 07:27:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:44.896 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:45.155 [2024-07-25 07:27:17.445248] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:45.155 [2024-07-25 07:27:17.445278] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:45.155 [2024-07-25 07:27:17.448183] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:45.155 [2024-07-25 07:27:17.448220] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:45.155 [2024-07-25 07:27:17.448255] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:45.155 [2024-07-25 07:27:17.448265] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3e5d0 name raid_bdev1, state offline 00:21:45.155 0 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1693151 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1693151 ']' 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1693151 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1693151 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1693151' 00:21:45.155 killing process with pid 1693151 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1693151 00:21:45.155 [2024-07-25 07:27:17.523167] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:45.155 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1693151 00:21:45.155 [2024-07-25 07:27:17.550567] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.nlP0S4hqF7 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:21:45.414 00:21:45.414 real 0m7.325s 00:21:45.414 user 0m11.665s 00:21:45.414 sys 0m1.287s 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:45.414 07:27:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:45.414 ************************************ 00:21:45.414 END TEST raid_read_error_test 00:21:45.414 ************************************ 00:21:45.414 07:27:17 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:21:45.414 07:27:17 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:45.414 07:27:17 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:45.414 07:27:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:45.414 ************************************ 00:21:45.414 START TEST raid_write_error_test 00:21:45.414 ************************************ 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 write 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.6dRrMpIrWR 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1694345 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1694345 /var/tmp/spdk-raid.sock 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@831 -- # '[' -z 1694345 ']' 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:45.414 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:45.414 07:27:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:45.414 [2024-07-25 07:27:17.902714] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:21:45.414 [2024-07-25 07:27:17.902771] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1694345 ] 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:21:45.674 EAL: Requested device 0000:3f:01.0 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:01.3 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:01.4 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:45.674 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:45.674 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:45.674 [2024-07-25 07:27:18.034729] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:45.674 [2024-07-25 07:27:18.118164] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:21:45.674 [2024-07-25 07:27:18.183181] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:45.674 [2024-07-25 07:27:18.183218] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:46.608 07:27:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:46.608 07:27:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:21:46.608 07:27:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:46.608 07:27:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:46.608 BaseBdev1_malloc 00:21:46.608 07:27:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:46.865 true 00:21:46.865 07:27:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:47.123 [2024-07-25 07:27:19.469844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:47.123 [2024-07-25 07:27:19.469889] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:47.123 [2024-07-25 07:27:19.469908] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x243ca50 00:21:47.123 [2024-07-25 07:27:19.469919] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:47.123 [2024-07-25 07:27:19.471349] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:47.123 [2024-07-25 07:27:19.471377] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:47.123 BaseBdev1 00:21:47.123 07:27:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:47.123 07:27:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:47.382 BaseBdev2_malloc 00:21:47.382 07:27:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:47.640 true 00:21:47.640 07:27:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:47.640 [2024-07-25 07:27:20.148005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:47.640 [2024-07-25 07:27:20.148055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:47.640 [2024-07-25 07:27:20.148074] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e5f40 00:21:47.640 [2024-07-25 07:27:20.148086] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:47.640 [2024-07-25 07:27:20.149474] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:47.640 [2024-07-25 07:27:20.149501] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:47.640 BaseBdev2 00:21:47.640 07:27:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:47.640 07:27:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:47.899 BaseBdev3_malloc 00:21:47.899 07:27:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:48.156 true 00:21:48.156 07:27:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:48.415 [2024-07-25 07:27:20.837962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:48.415 [2024-07-25 07:27:20.837998] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:21:48.415 [2024-07-25 07:27:20.838015] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e9250 00:21:48.415 [2024-07-25 07:27:20.838027] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:48.415 [2024-07-25 07:27:20.839281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:48.415 [2024-07-25 07:27:20.839306] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:48.415 BaseBdev3 00:21:48.415 07:27:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:48.415 07:27:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:48.673 BaseBdev4_malloc 00:21:48.673 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:48.931 true 00:21:48.931 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:49.189 [2024-07-25 07:27:21.519974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:49.189 [2024-07-25 07:27:21.520017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:49.189 [2024-07-25 07:27:21.520037] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25eab40 00:21:49.189 [2024-07-25 07:27:21.520049] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:49.189 [2024-07-25 07:27:21.521374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:49.189 [2024-07-25 07:27:21.521403] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:49.189 BaseBdev4 00:21:49.189 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:49.520 [2024-07-25 07:27:21.744600] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:49.520 [2024-07-25 07:27:21.745711] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:49.520 [2024-07-25 07:27:21.745775] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:49.520 [2024-07-25 07:27:21.745828] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:49.520 [2024-07-25 07:27:21.746048] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25e75d0 00:21:49.520 [2024-07-25 07:27:21.746058] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:49.520 [2024-07-25 07:27:21.746244] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2438c70 00:21:49.520 [2024-07-25 07:27:21.746382] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25e75d0 00:21:49.520 [2024-07-25 07:27:21.746392] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25e75d0 00:21:49.520 [2024-07-25 
07:27:21.746482] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.520 07:27:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.520 07:27:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.520 "name": "raid_bdev1", 00:21:49.520 "uuid": "d9b3f109-aa43-42a5-9838-2a08df7abaab", 00:21:49.520 "strip_size_kb": 64, 00:21:49.520 "state": "online", 00:21:49.520 "raid_level": "concat", 00:21:49.520 "superblock": true, 00:21:49.520 "num_base_bdevs": 4, 00:21:49.520 "num_base_bdevs_discovered": 4, 00:21:49.520 "num_base_bdevs_operational": 4, 00:21:49.520 "base_bdevs_list": [ 00:21:49.520 { 00:21:49.520 "name": "BaseBdev1", 00:21:49.520 "uuid": "3b57acff-4bd1-512b-b081-99a28f763e5e", 00:21:49.520 "is_configured": true, 00:21:49.520 "data_offset": 2048, 00:21:49.520 "data_size": 63488 00:21:49.520 }, 00:21:49.520 { 00:21:49.520 "name": "BaseBdev2", 00:21:49.520 "uuid": "7fb6aec1-3ede-547d-b6dd-07f494c67276", 00:21:49.520 "is_configured": true, 00:21:49.520 "data_offset": 2048, 00:21:49.520 "data_size": 63488 00:21:49.520 }, 00:21:49.520 { 00:21:49.520 "name": "BaseBdev3", 00:21:49.520 "uuid": "b8d4433a-d2db-5648-a2da-ff8667e48b79", 00:21:49.520 "is_configured": true, 00:21:49.520 "data_offset": 2048, 00:21:49.520 "data_size": 63488 00:21:49.520 }, 00:21:49.520 { 00:21:49.520 "name": "BaseBdev4", 00:21:49.520 "uuid": "cf3382de-24e5-5252-9d32-aeb04ef6b678", 00:21:49.520 "is_configured": true, 00:21:49.520 "data_offset": 2048, 00:21:49.520 "data_size": 63488 00:21:49.520 } 00:21:49.520 ] 00:21:49.520 }' 00:21:49.520 07:27:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.521 07:27:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.105 07:27:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:21:50.105 07:27:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:50.105 [2024-07-25 07:27:22.591118] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x24dbf40 00:21:51.039 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.297 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.555 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.555 "name": "raid_bdev1", 00:21:51.555 "uuid": "d9b3f109-aa43-42a5-9838-2a08df7abaab", 00:21:51.555 "strip_size_kb": 64, 00:21:51.555 "state": "online", 00:21:51.555 "raid_level": "concat", 00:21:51.555 "superblock": true, 00:21:51.555 "num_base_bdevs": 4, 00:21:51.555 "num_base_bdevs_discovered": 4, 00:21:51.555 "num_base_bdevs_operational": 4, 00:21:51.555 "base_bdevs_list": [ 00:21:51.555 { 00:21:51.555 "name": "BaseBdev1", 00:21:51.555 "uuid": "3b57acff-4bd1-512b-b081-99a28f763e5e", 00:21:51.555 "is_configured": true, 00:21:51.555 "data_offset": 2048, 00:21:51.555 "data_size": 63488 00:21:51.555 }, 00:21:51.555 { 00:21:51.555 "name": "BaseBdev2", 00:21:51.556 "uuid": "7fb6aec1-3ede-547d-b6dd-07f494c67276", 00:21:51.556 "is_configured": true, 00:21:51.556 "data_offset": 2048, 00:21:51.556 "data_size": 63488 00:21:51.556 }, 00:21:51.556 { 00:21:51.556 "name": "BaseBdev3", 00:21:51.556 "uuid": "b8d4433a-d2db-5648-a2da-ff8667e48b79", 00:21:51.556 "is_configured": true, 00:21:51.556 "data_offset": 2048, 00:21:51.556 "data_size": 63488 00:21:51.556 }, 00:21:51.556 { 00:21:51.556 "name": "BaseBdev4", 00:21:51.556 "uuid": "cf3382de-24e5-5252-9d32-aeb04ef6b678", 00:21:51.556 "is_configured": true, 00:21:51.556 "data_offset": 2048, 00:21:51.556 "data_size": 63488 00:21:51.556 } 00:21:51.556 ] 00:21:51.556 }' 00:21:51.556 07:27:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:21:51.556 07:27:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.121 07:27:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:52.379 [2024-07-25 07:27:24.765877] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:52.379 [2024-07-25 07:27:24.765913] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:52.379 [2024-07-25 07:27:24.768844] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:52.379 [2024-07-25 07:27:24.768883] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:52.379 [2024-07-25 07:27:24.768919] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:52.379 [2024-07-25 07:27:24.768929] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e75d0 name raid_bdev1, state offline 00:21:52.379 0 00:21:52.379 07:27:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1694345 00:21:52.379 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1694345 ']' 00:21:52.379 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1694345 00:21:52.379 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:21:52.379 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:52.380 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1694345 00:21:52.380 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:52.380 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:52.380 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1694345' 00:21:52.380 killing process with pid 1694345 00:21:52.380 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1694345 00:21:52.380 [2024-07-25 07:27:24.835569] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:52.380 07:27:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1694345 00:21:52.380 [2024-07-25 07:27:24.863200] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.6dRrMpIrWR 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:21:52.638 00:21:52.638 real 0m7.232s 00:21:52.638 user 0m11.482s 00:21:52.638 sys 0m1.306s 00:21:52.638 07:27:25 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:52.638 07:27:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.638 ************************************ 00:21:52.638 END TEST raid_write_error_test 00:21:52.638 ************************************ 00:21:52.638 07:27:25 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:21:52.638 07:27:25 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:21:52.638 07:27:25 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:52.638 07:27:25 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:52.638 07:27:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:52.638 ************************************ 00:21:52.638 START TEST raid_state_function_test 00:21:52.638 ************************************ 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 false 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local 
strip_size_create_arg 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:52.638 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:52.896 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1695745 00:21:52.896 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1695745' 00:21:52.896 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:52.896 Process raid pid: 1695745 00:21:52.896 07:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1695745 /var/tmp/spdk-raid.sock 00:21:52.896 07:27:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1695745 ']' 00:21:52.897 07:27:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:52.897 07:27:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:52.897 07:27:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:52.897 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:52.897 07:27:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:52.897 07:27:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.897 [2024-07-25 07:27:25.227933] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
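The raid_state_function_test run that begins here drives everything through SPDK's rpc.py against the /var/tmp/spdk-raid.sock socket opened by the bdev_svc app above. As a rough sketch only, assuming the SPDK tree at the workspace path shown in this log and a bdev_svc instance already listening on that socket (exact RPC flags can differ between SPDK versions), the core sequence the test exercises looks like this: create the raid1 volume before its members exist, add the malloc base bdevs one at a time, watch the state flip from "configuring" to "online", then tear it down.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create a raid1 volume whose base bdevs do not exist yet; it is registered
# in the "configuring" state and waits for its members to appear.
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# Add the base bdevs one at a time (32 MiB malloc disks, 512-byte blocks),
# mirroring the bdev_malloc_create calls later in this log.
for i in 1 2 3 4; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev$i
done

# Inspect the array; once all four members are claimed the state reported
# here changes from "configuring" to "online".
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

# Tear the volume down again.
$RPC bdev_raid_delete Existed_Raid

The actual test is stricter than this sketch: as the log below shows, it also deletes and re-creates Existed_Raid between member additions and verifies the full bdev_raid_get_bdevs JSON (num_base_bdevs_discovered, data_offset, data_size) at every step rather than just the state field.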
00:21:52.897 [2024-07-25 07:27:25.227989] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.0 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.3 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.4 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:52.897 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:52.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:52.897 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:52.897 [2024-07-25 07:27:25.359432] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:53.156 [2024-07-25 07:27:25.446125] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:21:53.156 [2024-07-25 07:27:25.503749] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:53.156 [2024-07-25 07:27:25.503777] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:53.722 07:27:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:53.722 07:27:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:21:53.722 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:53.980 [2024-07-25 07:27:26.337517] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:53.980 [2024-07-25 07:27:26.337555] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:53.980 [2024-07-25 07:27:26.337565] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:53.980 [2024-07-25 07:27:26.337575] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:53.980 [2024-07-25 07:27:26.337583] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:53.980 [2024-07-25 07:27:26.337593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:53.980 [2024-07-25 07:27:26.337600] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:53.980 [2024-07-25 07:27:26.337610] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:53.980 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.237 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.238 "name": "Existed_Raid", 00:21:54.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.238 "strip_size_kb": 0, 00:21:54.238 "state": "configuring", 00:21:54.238 "raid_level": "raid1", 00:21:54.238 "superblock": false, 00:21:54.238 "num_base_bdevs": 4, 00:21:54.238 "num_base_bdevs_discovered": 0, 00:21:54.238 "num_base_bdevs_operational": 4, 00:21:54.238 "base_bdevs_list": [ 00:21:54.238 { 00:21:54.238 "name": "BaseBdev1", 00:21:54.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.238 "is_configured": false, 00:21:54.238 "data_offset": 0, 00:21:54.238 "data_size": 0 00:21:54.238 }, 00:21:54.238 { 00:21:54.238 "name": "BaseBdev2", 00:21:54.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.238 "is_configured": false, 00:21:54.238 "data_offset": 0, 00:21:54.238 "data_size": 0 00:21:54.238 }, 00:21:54.238 { 00:21:54.238 "name": "BaseBdev3", 00:21:54.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.238 "is_configured": false, 00:21:54.238 "data_offset": 0, 00:21:54.238 "data_size": 0 00:21:54.238 }, 00:21:54.238 { 00:21:54.238 "name": "BaseBdev4", 00:21:54.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.238 "is_configured": false, 00:21:54.238 "data_offset": 0, 00:21:54.238 "data_size": 0 00:21:54.238 } 00:21:54.238 ] 00:21:54.238 }' 00:21:54.238 07:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.238 07:27:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.804 07:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:55.089 [2024-07-25 07:27:27.364108] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:55.089 [2024-07-25 07:27:27.364137] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2695ee0 name Existed_Raid, state configuring 00:21:55.089 07:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:55.089 [2024-07-25 
07:27:27.592715] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:55.089 [2024-07-25 07:27:27.592742] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:55.089 [2024-07-25 07:27:27.592751] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:55.089 [2024-07-25 07:27:27.592761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:55.090 [2024-07-25 07:27:27.592769] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:55.090 [2024-07-25 07:27:27.592779] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:55.090 [2024-07-25 07:27:27.592787] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:55.090 [2024-07-25 07:27:27.592797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:55.348 [2024-07-25 07:27:27.826753] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:55.348 BaseBdev1 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:55.348 07:27:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:55.606 07:27:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:55.865 [ 00:21:55.865 { 00:21:55.865 "name": "BaseBdev1", 00:21:55.865 "aliases": [ 00:21:55.865 "0f540373-0afe-45ce-864a-e1411a65c5e1" 00:21:55.865 ], 00:21:55.865 "product_name": "Malloc disk", 00:21:55.865 "block_size": 512, 00:21:55.865 "num_blocks": 65536, 00:21:55.865 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:21:55.865 "assigned_rate_limits": { 00:21:55.865 "rw_ios_per_sec": 0, 00:21:55.865 "rw_mbytes_per_sec": 0, 00:21:55.865 "r_mbytes_per_sec": 0, 00:21:55.865 "w_mbytes_per_sec": 0 00:21:55.865 }, 00:21:55.865 "claimed": true, 00:21:55.865 "claim_type": "exclusive_write", 00:21:55.865 "zoned": false, 00:21:55.865 "supported_io_types": { 00:21:55.865 "read": true, 00:21:55.865 "write": true, 00:21:55.865 "unmap": true, 00:21:55.865 "flush": true, 00:21:55.865 "reset": true, 00:21:55.865 "nvme_admin": false, 00:21:55.865 "nvme_io": false, 00:21:55.865 "nvme_io_md": false, 00:21:55.865 "write_zeroes": true, 00:21:55.865 "zcopy": true, 00:21:55.865 "get_zone_info": false, 00:21:55.865 "zone_management": false, 00:21:55.865 
"zone_append": false, 00:21:55.865 "compare": false, 00:21:55.865 "compare_and_write": false, 00:21:55.865 "abort": true, 00:21:55.865 "seek_hole": false, 00:21:55.865 "seek_data": false, 00:21:55.865 "copy": true, 00:21:55.865 "nvme_iov_md": false 00:21:55.865 }, 00:21:55.865 "memory_domains": [ 00:21:55.865 { 00:21:55.865 "dma_device_id": "system", 00:21:55.865 "dma_device_type": 1 00:21:55.865 }, 00:21:55.865 { 00:21:55.865 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.865 "dma_device_type": 2 00:21:55.865 } 00:21:55.865 ], 00:21:55.865 "driver_specific": {} 00:21:55.865 } 00:21:55.865 ] 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.865 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:56.124 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:56.124 "name": "Existed_Raid", 00:21:56.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.124 "strip_size_kb": 0, 00:21:56.124 "state": "configuring", 00:21:56.124 "raid_level": "raid1", 00:21:56.124 "superblock": false, 00:21:56.124 "num_base_bdevs": 4, 00:21:56.124 "num_base_bdevs_discovered": 1, 00:21:56.124 "num_base_bdevs_operational": 4, 00:21:56.124 "base_bdevs_list": [ 00:21:56.124 { 00:21:56.124 "name": "BaseBdev1", 00:21:56.124 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:21:56.124 "is_configured": true, 00:21:56.124 "data_offset": 0, 00:21:56.124 "data_size": 65536 00:21:56.124 }, 00:21:56.124 { 00:21:56.124 "name": "BaseBdev2", 00:21:56.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.124 "is_configured": false, 00:21:56.124 "data_offset": 0, 00:21:56.124 "data_size": 0 00:21:56.124 }, 00:21:56.124 { 00:21:56.124 "name": "BaseBdev3", 00:21:56.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.124 "is_configured": false, 00:21:56.124 "data_offset": 0, 00:21:56.124 "data_size": 0 00:21:56.124 }, 00:21:56.124 { 00:21:56.124 "name": "BaseBdev4", 00:21:56.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.124 "is_configured": false, 00:21:56.124 "data_offset": 0, 
00:21:56.124 "data_size": 0 00:21:56.124 } 00:21:56.124 ] 00:21:56.124 }' 00:21:56.124 07:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:56.124 07:27:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.690 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:56.947 [2024-07-25 07:27:29.310665] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:56.947 [2024-07-25 07:27:29.310702] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2695750 name Existed_Raid, state configuring 00:21:56.947 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:57.206 [2024-07-25 07:27:29.539289] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:57.206 [2024-07-25 07:27:29.540678] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:57.206 [2024-07-25 07:27:29.540710] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:57.206 [2024-07-25 07:27:29.540720] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:57.206 [2024-07-25 07:27:29.540731] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:57.206 [2024-07-25 07:27:29.540739] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:57.206 [2024-07-25 07:27:29.540749] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.206 07:27:29 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:57.465 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.465 "name": "Existed_Raid", 00:21:57.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.465 "strip_size_kb": 0, 00:21:57.465 "state": "configuring", 00:21:57.465 "raid_level": "raid1", 00:21:57.465 "superblock": false, 00:21:57.465 "num_base_bdevs": 4, 00:21:57.465 "num_base_bdevs_discovered": 1, 00:21:57.465 "num_base_bdevs_operational": 4, 00:21:57.465 "base_bdevs_list": [ 00:21:57.465 { 00:21:57.465 "name": "BaseBdev1", 00:21:57.465 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:21:57.465 "is_configured": true, 00:21:57.465 "data_offset": 0, 00:21:57.465 "data_size": 65536 00:21:57.465 }, 00:21:57.465 { 00:21:57.465 "name": "BaseBdev2", 00:21:57.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.465 "is_configured": false, 00:21:57.465 "data_offset": 0, 00:21:57.465 "data_size": 0 00:21:57.465 }, 00:21:57.465 { 00:21:57.465 "name": "BaseBdev3", 00:21:57.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.465 "is_configured": false, 00:21:57.465 "data_offset": 0, 00:21:57.465 "data_size": 0 00:21:57.465 }, 00:21:57.465 { 00:21:57.465 "name": "BaseBdev4", 00:21:57.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.465 "is_configured": false, 00:21:57.465 "data_offset": 0, 00:21:57.465 "data_size": 0 00:21:57.465 } 00:21:57.465 ] 00:21:57.465 }' 00:21:57.465 07:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.465 07:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:58.031 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:58.289 [2024-07-25 07:27:30.577202] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:58.289 BaseBdev2 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:58.289 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:58.547 [ 00:21:58.547 { 00:21:58.547 "name": "BaseBdev2", 00:21:58.547 "aliases": [ 00:21:58.547 "235e5878-d832-4b45-8f3c-8f4da71d19c0" 00:21:58.547 ], 00:21:58.547 "product_name": "Malloc disk", 00:21:58.547 "block_size": 512, 00:21:58.547 "num_blocks": 65536, 00:21:58.547 "uuid": "235e5878-d832-4b45-8f3c-8f4da71d19c0", 00:21:58.547 "assigned_rate_limits": { 00:21:58.547 
"rw_ios_per_sec": 0, 00:21:58.547 "rw_mbytes_per_sec": 0, 00:21:58.547 "r_mbytes_per_sec": 0, 00:21:58.547 "w_mbytes_per_sec": 0 00:21:58.547 }, 00:21:58.547 "claimed": true, 00:21:58.547 "claim_type": "exclusive_write", 00:21:58.547 "zoned": false, 00:21:58.547 "supported_io_types": { 00:21:58.547 "read": true, 00:21:58.547 "write": true, 00:21:58.547 "unmap": true, 00:21:58.547 "flush": true, 00:21:58.547 "reset": true, 00:21:58.547 "nvme_admin": false, 00:21:58.547 "nvme_io": false, 00:21:58.547 "nvme_io_md": false, 00:21:58.547 "write_zeroes": true, 00:21:58.547 "zcopy": true, 00:21:58.547 "get_zone_info": false, 00:21:58.547 "zone_management": false, 00:21:58.547 "zone_append": false, 00:21:58.547 "compare": false, 00:21:58.547 "compare_and_write": false, 00:21:58.547 "abort": true, 00:21:58.547 "seek_hole": false, 00:21:58.547 "seek_data": false, 00:21:58.547 "copy": true, 00:21:58.547 "nvme_iov_md": false 00:21:58.547 }, 00:21:58.547 "memory_domains": [ 00:21:58.547 { 00:21:58.547 "dma_device_id": "system", 00:21:58.547 "dma_device_type": 1 00:21:58.547 }, 00:21:58.547 { 00:21:58.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.547 "dma_device_type": 2 00:21:58.547 } 00:21:58.547 ], 00:21:58.547 "driver_specific": {} 00:21:58.547 } 00:21:58.547 ] 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.547 07:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:58.805 07:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.805 "name": "Existed_Raid", 00:21:58.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.805 "strip_size_kb": 0, 00:21:58.805 "state": "configuring", 00:21:58.805 "raid_level": "raid1", 00:21:58.805 "superblock": false, 00:21:58.805 "num_base_bdevs": 4, 00:21:58.805 "num_base_bdevs_discovered": 2, 00:21:58.805 "num_base_bdevs_operational": 4, 
00:21:58.805 "base_bdevs_list": [ 00:21:58.805 { 00:21:58.805 "name": "BaseBdev1", 00:21:58.805 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:21:58.805 "is_configured": true, 00:21:58.805 "data_offset": 0, 00:21:58.805 "data_size": 65536 00:21:58.805 }, 00:21:58.805 { 00:21:58.805 "name": "BaseBdev2", 00:21:58.805 "uuid": "235e5878-d832-4b45-8f3c-8f4da71d19c0", 00:21:58.805 "is_configured": true, 00:21:58.805 "data_offset": 0, 00:21:58.805 "data_size": 65536 00:21:58.805 }, 00:21:58.805 { 00:21:58.805 "name": "BaseBdev3", 00:21:58.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.805 "is_configured": false, 00:21:58.805 "data_offset": 0, 00:21:58.805 "data_size": 0 00:21:58.805 }, 00:21:58.805 { 00:21:58.805 "name": "BaseBdev4", 00:21:58.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.805 "is_configured": false, 00:21:58.805 "data_offset": 0, 00:21:58.805 "data_size": 0 00:21:58.805 } 00:21:58.805 ] 00:21:58.805 }' 00:21:58.805 07:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.806 07:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.372 07:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:59.630 [2024-07-25 07:27:31.980056] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:59.630 BaseBdev3 00:21:59.630 07:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:59.630 07:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:21:59.630 07:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:59.630 07:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:59.630 07:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:59.630 07:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:59.630 07:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:59.888 07:27:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:59.888 [ 00:21:59.888 { 00:21:59.888 "name": "BaseBdev3", 00:21:59.888 "aliases": [ 00:21:59.888 "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0" 00:21:59.888 ], 00:21:59.888 "product_name": "Malloc disk", 00:21:59.888 "block_size": 512, 00:21:59.888 "num_blocks": 65536, 00:21:59.888 "uuid": "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0", 00:21:59.888 "assigned_rate_limits": { 00:21:59.888 "rw_ios_per_sec": 0, 00:21:59.888 "rw_mbytes_per_sec": 0, 00:21:59.888 "r_mbytes_per_sec": 0, 00:21:59.888 "w_mbytes_per_sec": 0 00:21:59.888 }, 00:21:59.888 "claimed": true, 00:21:59.888 "claim_type": "exclusive_write", 00:21:59.888 "zoned": false, 00:21:59.888 "supported_io_types": { 00:21:59.889 "read": true, 00:21:59.889 "write": true, 00:21:59.889 "unmap": true, 00:21:59.889 "flush": true, 00:21:59.889 "reset": true, 00:21:59.889 "nvme_admin": false, 00:21:59.889 "nvme_io": false, 00:21:59.889 "nvme_io_md": false, 00:21:59.889 
"write_zeroes": true, 00:21:59.889 "zcopy": true, 00:21:59.889 "get_zone_info": false, 00:21:59.889 "zone_management": false, 00:21:59.889 "zone_append": false, 00:21:59.889 "compare": false, 00:21:59.889 "compare_and_write": false, 00:21:59.889 "abort": true, 00:21:59.889 "seek_hole": false, 00:21:59.889 "seek_data": false, 00:21:59.889 "copy": true, 00:21:59.889 "nvme_iov_md": false 00:21:59.889 }, 00:21:59.889 "memory_domains": [ 00:21:59.889 { 00:21:59.889 "dma_device_id": "system", 00:21:59.889 "dma_device_type": 1 00:21:59.889 }, 00:21:59.889 { 00:21:59.889 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.889 "dma_device_type": 2 00:21:59.889 } 00:21:59.889 ], 00:21:59.889 "driver_specific": {} 00:21:59.889 } 00:21:59.889 ] 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.147 "name": "Existed_Raid", 00:22:00.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.147 "strip_size_kb": 0, 00:22:00.147 "state": "configuring", 00:22:00.147 "raid_level": "raid1", 00:22:00.147 "superblock": false, 00:22:00.147 "num_base_bdevs": 4, 00:22:00.147 "num_base_bdevs_discovered": 3, 00:22:00.147 "num_base_bdevs_operational": 4, 00:22:00.147 "base_bdevs_list": [ 00:22:00.147 { 00:22:00.147 "name": "BaseBdev1", 00:22:00.147 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:22:00.147 "is_configured": true, 00:22:00.147 "data_offset": 0, 00:22:00.147 "data_size": 65536 00:22:00.147 }, 00:22:00.147 { 00:22:00.147 "name": "BaseBdev2", 00:22:00.147 "uuid": "235e5878-d832-4b45-8f3c-8f4da71d19c0", 00:22:00.147 "is_configured": true, 00:22:00.147 "data_offset": 0, 00:22:00.147 "data_size": 65536 00:22:00.147 }, 00:22:00.147 { 00:22:00.147 "name": "BaseBdev3", 
00:22:00.147 "uuid": "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0", 00:22:00.147 "is_configured": true, 00:22:00.147 "data_offset": 0, 00:22:00.147 "data_size": 65536 00:22:00.147 }, 00:22:00.147 { 00:22:00.147 "name": "BaseBdev4", 00:22:00.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.147 "is_configured": false, 00:22:00.147 "data_offset": 0, 00:22:00.147 "data_size": 0 00:22:00.147 } 00:22:00.147 ] 00:22:00.147 }' 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.147 07:27:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.714 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:00.973 [2024-07-25 07:27:33.447173] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:00.973 [2024-07-25 07:27:33.447212] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26967b0 00:22:00.973 [2024-07-25 07:27:33.447219] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:00.973 [2024-07-25 07:27:33.447394] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28499d0 00:22:00.973 [2024-07-25 07:27:33.447521] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26967b0 00:22:00.973 [2024-07-25 07:27:33.447530] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26967b0 00:22:00.973 [2024-07-25 07:27:33.447684] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:00.973 BaseBdev4 00:22:00.973 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:00.973 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:00.973 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:00.973 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:00.973 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:00.973 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:00.973 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:01.230 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:01.488 [ 00:22:01.488 { 00:22:01.488 "name": "BaseBdev4", 00:22:01.488 "aliases": [ 00:22:01.488 "8f10355a-2ebb-4e22-8dba-21555aa4af99" 00:22:01.488 ], 00:22:01.488 "product_name": "Malloc disk", 00:22:01.488 "block_size": 512, 00:22:01.488 "num_blocks": 65536, 00:22:01.488 "uuid": "8f10355a-2ebb-4e22-8dba-21555aa4af99", 00:22:01.488 "assigned_rate_limits": { 00:22:01.488 "rw_ios_per_sec": 0, 00:22:01.488 "rw_mbytes_per_sec": 0, 00:22:01.488 "r_mbytes_per_sec": 0, 00:22:01.488 "w_mbytes_per_sec": 0 00:22:01.488 }, 00:22:01.488 "claimed": true, 00:22:01.488 "claim_type": "exclusive_write", 00:22:01.488 "zoned": false, 00:22:01.488 "supported_io_types": { 00:22:01.488 "read": true, 
00:22:01.488 "write": true, 00:22:01.488 "unmap": true, 00:22:01.488 "flush": true, 00:22:01.488 "reset": true, 00:22:01.488 "nvme_admin": false, 00:22:01.488 "nvme_io": false, 00:22:01.488 "nvme_io_md": false, 00:22:01.488 "write_zeroes": true, 00:22:01.488 "zcopy": true, 00:22:01.488 "get_zone_info": false, 00:22:01.488 "zone_management": false, 00:22:01.488 "zone_append": false, 00:22:01.488 "compare": false, 00:22:01.488 "compare_and_write": false, 00:22:01.488 "abort": true, 00:22:01.488 "seek_hole": false, 00:22:01.488 "seek_data": false, 00:22:01.488 "copy": true, 00:22:01.488 "nvme_iov_md": false 00:22:01.488 }, 00:22:01.488 "memory_domains": [ 00:22:01.488 { 00:22:01.488 "dma_device_id": "system", 00:22:01.488 "dma_device_type": 1 00:22:01.488 }, 00:22:01.488 { 00:22:01.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.488 "dma_device_type": 2 00:22:01.488 } 00:22:01.488 ], 00:22:01.488 "driver_specific": {} 00:22:01.488 } 00:22:01.488 ] 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.488 07:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:01.747 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.747 "name": "Existed_Raid", 00:22:01.747 "uuid": "07922a2f-c376-4f0f-9c80-4d732b1b8e9c", 00:22:01.747 "strip_size_kb": 0, 00:22:01.747 "state": "online", 00:22:01.747 "raid_level": "raid1", 00:22:01.747 "superblock": false, 00:22:01.747 "num_base_bdevs": 4, 00:22:01.747 "num_base_bdevs_discovered": 4, 00:22:01.747 "num_base_bdevs_operational": 4, 00:22:01.747 "base_bdevs_list": [ 00:22:01.747 { 00:22:01.747 "name": "BaseBdev1", 00:22:01.747 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:22:01.747 "is_configured": true, 00:22:01.747 "data_offset": 0, 00:22:01.747 "data_size": 65536 00:22:01.747 }, 00:22:01.747 { 00:22:01.747 "name": "BaseBdev2", 00:22:01.747 "uuid": 
"235e5878-d832-4b45-8f3c-8f4da71d19c0", 00:22:01.747 "is_configured": true, 00:22:01.747 "data_offset": 0, 00:22:01.747 "data_size": 65536 00:22:01.747 }, 00:22:01.747 { 00:22:01.747 "name": "BaseBdev3", 00:22:01.747 "uuid": "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0", 00:22:01.747 "is_configured": true, 00:22:01.747 "data_offset": 0, 00:22:01.747 "data_size": 65536 00:22:01.747 }, 00:22:01.747 { 00:22:01.747 "name": "BaseBdev4", 00:22:01.747 "uuid": "8f10355a-2ebb-4e22-8dba-21555aa4af99", 00:22:01.747 "is_configured": true, 00:22:01.747 "data_offset": 0, 00:22:01.747 "data_size": 65536 00:22:01.747 } 00:22:01.747 ] 00:22:01.747 }' 00:22:01.747 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.747 07:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:02.313 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:02.572 [2024-07-25 07:27:34.931395] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:02.572 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:02.572 "name": "Existed_Raid", 00:22:02.572 "aliases": [ 00:22:02.572 "07922a2f-c376-4f0f-9c80-4d732b1b8e9c" 00:22:02.572 ], 00:22:02.572 "product_name": "Raid Volume", 00:22:02.572 "block_size": 512, 00:22:02.572 "num_blocks": 65536, 00:22:02.572 "uuid": "07922a2f-c376-4f0f-9c80-4d732b1b8e9c", 00:22:02.572 "assigned_rate_limits": { 00:22:02.572 "rw_ios_per_sec": 0, 00:22:02.572 "rw_mbytes_per_sec": 0, 00:22:02.572 "r_mbytes_per_sec": 0, 00:22:02.572 "w_mbytes_per_sec": 0 00:22:02.572 }, 00:22:02.572 "claimed": false, 00:22:02.572 "zoned": false, 00:22:02.572 "supported_io_types": { 00:22:02.572 "read": true, 00:22:02.572 "write": true, 00:22:02.572 "unmap": false, 00:22:02.572 "flush": false, 00:22:02.572 "reset": true, 00:22:02.572 "nvme_admin": false, 00:22:02.572 "nvme_io": false, 00:22:02.572 "nvme_io_md": false, 00:22:02.572 "write_zeroes": true, 00:22:02.572 "zcopy": false, 00:22:02.572 "get_zone_info": false, 00:22:02.572 "zone_management": false, 00:22:02.572 "zone_append": false, 00:22:02.572 "compare": false, 00:22:02.572 "compare_and_write": false, 00:22:02.572 "abort": false, 00:22:02.572 "seek_hole": false, 00:22:02.572 "seek_data": false, 00:22:02.572 "copy": false, 00:22:02.572 "nvme_iov_md": false 00:22:02.572 }, 00:22:02.572 "memory_domains": [ 00:22:02.572 { 00:22:02.572 "dma_device_id": "system", 00:22:02.572 "dma_device_type": 1 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.572 "dma_device_type": 2 00:22:02.572 }, 
00:22:02.572 { 00:22:02.572 "dma_device_id": "system", 00:22:02.572 "dma_device_type": 1 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.572 "dma_device_type": 2 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "dma_device_id": "system", 00:22:02.572 "dma_device_type": 1 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.572 "dma_device_type": 2 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "dma_device_id": "system", 00:22:02.572 "dma_device_type": 1 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.572 "dma_device_type": 2 00:22:02.572 } 00:22:02.572 ], 00:22:02.572 "driver_specific": { 00:22:02.572 "raid": { 00:22:02.572 "uuid": "07922a2f-c376-4f0f-9c80-4d732b1b8e9c", 00:22:02.572 "strip_size_kb": 0, 00:22:02.572 "state": "online", 00:22:02.572 "raid_level": "raid1", 00:22:02.572 "superblock": false, 00:22:02.572 "num_base_bdevs": 4, 00:22:02.572 "num_base_bdevs_discovered": 4, 00:22:02.572 "num_base_bdevs_operational": 4, 00:22:02.572 "base_bdevs_list": [ 00:22:02.572 { 00:22:02.572 "name": "BaseBdev1", 00:22:02.572 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:22:02.572 "is_configured": true, 00:22:02.572 "data_offset": 0, 00:22:02.572 "data_size": 65536 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "name": "BaseBdev2", 00:22:02.572 "uuid": "235e5878-d832-4b45-8f3c-8f4da71d19c0", 00:22:02.572 "is_configured": true, 00:22:02.572 "data_offset": 0, 00:22:02.572 "data_size": 65536 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "name": "BaseBdev3", 00:22:02.572 "uuid": "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0", 00:22:02.572 "is_configured": true, 00:22:02.572 "data_offset": 0, 00:22:02.572 "data_size": 65536 00:22:02.572 }, 00:22:02.572 { 00:22:02.572 "name": "BaseBdev4", 00:22:02.572 "uuid": "8f10355a-2ebb-4e22-8dba-21555aa4af99", 00:22:02.572 "is_configured": true, 00:22:02.572 "data_offset": 0, 00:22:02.572 "data_size": 65536 00:22:02.572 } 00:22:02.572 ] 00:22:02.572 } 00:22:02.572 } 00:22:02.572 }' 00:22:02.572 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:02.572 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:02.572 BaseBdev2 00:22:02.572 BaseBdev3 00:22:02.572 BaseBdev4' 00:22:02.572 07:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.572 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:02.572 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.830 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.830 "name": "BaseBdev1", 00:22:02.830 "aliases": [ 00:22:02.830 "0f540373-0afe-45ce-864a-e1411a65c5e1" 00:22:02.830 ], 00:22:02.830 "product_name": "Malloc disk", 00:22:02.830 "block_size": 512, 00:22:02.830 "num_blocks": 65536, 00:22:02.830 "uuid": "0f540373-0afe-45ce-864a-e1411a65c5e1", 00:22:02.830 "assigned_rate_limits": { 00:22:02.830 "rw_ios_per_sec": 0, 00:22:02.830 "rw_mbytes_per_sec": 0, 00:22:02.830 "r_mbytes_per_sec": 0, 00:22:02.830 "w_mbytes_per_sec": 0 00:22:02.830 }, 00:22:02.830 "claimed": true, 00:22:02.830 "claim_type": "exclusive_write", 00:22:02.830 "zoned": false, 00:22:02.830 
"supported_io_types": { 00:22:02.830 "read": true, 00:22:02.830 "write": true, 00:22:02.830 "unmap": true, 00:22:02.830 "flush": true, 00:22:02.830 "reset": true, 00:22:02.830 "nvme_admin": false, 00:22:02.830 "nvme_io": false, 00:22:02.830 "nvme_io_md": false, 00:22:02.830 "write_zeroes": true, 00:22:02.830 "zcopy": true, 00:22:02.830 "get_zone_info": false, 00:22:02.830 "zone_management": false, 00:22:02.830 "zone_append": false, 00:22:02.830 "compare": false, 00:22:02.830 "compare_and_write": false, 00:22:02.830 "abort": true, 00:22:02.830 "seek_hole": false, 00:22:02.830 "seek_data": false, 00:22:02.830 "copy": true, 00:22:02.830 "nvme_iov_md": false 00:22:02.830 }, 00:22:02.830 "memory_domains": [ 00:22:02.830 { 00:22:02.830 "dma_device_id": "system", 00:22:02.830 "dma_device_type": 1 00:22:02.830 }, 00:22:02.830 { 00:22:02.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.830 "dma_device_type": 2 00:22:02.830 } 00:22:02.830 ], 00:22:02.830 "driver_specific": {} 00:22:02.830 }' 00:22:02.830 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.830 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.830 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:02.830 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.087 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.087 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:03.088 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:03.345 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:03.345 "name": "BaseBdev2", 00:22:03.345 "aliases": [ 00:22:03.345 "235e5878-d832-4b45-8f3c-8f4da71d19c0" 00:22:03.345 ], 00:22:03.345 "product_name": "Malloc disk", 00:22:03.345 "block_size": 512, 00:22:03.345 "num_blocks": 65536, 00:22:03.345 "uuid": "235e5878-d832-4b45-8f3c-8f4da71d19c0", 00:22:03.345 "assigned_rate_limits": { 00:22:03.345 "rw_ios_per_sec": 0, 00:22:03.345 "rw_mbytes_per_sec": 0, 00:22:03.345 "r_mbytes_per_sec": 0, 00:22:03.345 "w_mbytes_per_sec": 0 00:22:03.345 }, 00:22:03.345 "claimed": true, 00:22:03.345 "claim_type": "exclusive_write", 00:22:03.345 "zoned": false, 00:22:03.345 "supported_io_types": { 00:22:03.345 "read": true, 00:22:03.345 "write": true, 00:22:03.345 "unmap": true, 00:22:03.345 "flush": true, 00:22:03.345 "reset": true, 00:22:03.345 
"nvme_admin": false, 00:22:03.345 "nvme_io": false, 00:22:03.345 "nvme_io_md": false, 00:22:03.345 "write_zeroes": true, 00:22:03.345 "zcopy": true, 00:22:03.345 "get_zone_info": false, 00:22:03.345 "zone_management": false, 00:22:03.345 "zone_append": false, 00:22:03.345 "compare": false, 00:22:03.345 "compare_and_write": false, 00:22:03.345 "abort": true, 00:22:03.345 "seek_hole": false, 00:22:03.345 "seek_data": false, 00:22:03.345 "copy": true, 00:22:03.345 "nvme_iov_md": false 00:22:03.345 }, 00:22:03.345 "memory_domains": [ 00:22:03.345 { 00:22:03.345 "dma_device_id": "system", 00:22:03.345 "dma_device_type": 1 00:22:03.345 }, 00:22:03.345 { 00:22:03.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.345 "dma_device_type": 2 00:22:03.345 } 00:22:03.345 ], 00:22:03.345 "driver_specific": {} 00:22:03.345 }' 00:22:03.345 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.345 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.602 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:03.603 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.603 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.603 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:03.603 07:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.603 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.603 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:03.603 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.603 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.860 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:03.860 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:03.860 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:03.860 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:03.860 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:03.860 "name": "BaseBdev3", 00:22:03.860 "aliases": [ 00:22:03.860 "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0" 00:22:03.860 ], 00:22:03.860 "product_name": "Malloc disk", 00:22:03.860 "block_size": 512, 00:22:03.860 "num_blocks": 65536, 00:22:03.860 "uuid": "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0", 00:22:03.860 "assigned_rate_limits": { 00:22:03.860 "rw_ios_per_sec": 0, 00:22:03.860 "rw_mbytes_per_sec": 0, 00:22:03.860 "r_mbytes_per_sec": 0, 00:22:03.860 "w_mbytes_per_sec": 0 00:22:03.860 }, 00:22:03.860 "claimed": true, 00:22:03.860 "claim_type": "exclusive_write", 00:22:03.860 "zoned": false, 00:22:03.860 "supported_io_types": { 00:22:03.860 "read": true, 00:22:03.860 "write": true, 00:22:03.860 "unmap": true, 00:22:03.860 "flush": true, 00:22:03.860 "reset": true, 00:22:03.860 "nvme_admin": false, 00:22:03.860 "nvme_io": false, 00:22:03.860 "nvme_io_md": false, 00:22:03.860 "write_zeroes": true, 00:22:03.860 "zcopy": true, 00:22:03.860 "get_zone_info": 
false, 00:22:03.860 "zone_management": false, 00:22:03.860 "zone_append": false, 00:22:03.860 "compare": false, 00:22:03.860 "compare_and_write": false, 00:22:03.860 "abort": true, 00:22:03.860 "seek_hole": false, 00:22:03.860 "seek_data": false, 00:22:03.860 "copy": true, 00:22:03.860 "nvme_iov_md": false 00:22:03.860 }, 00:22:03.860 "memory_domains": [ 00:22:03.860 { 00:22:03.860 "dma_device_id": "system", 00:22:03.860 "dma_device_type": 1 00:22:03.860 }, 00:22:03.860 { 00:22:03.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.860 "dma_device_type": 2 00:22:03.860 } 00:22:03.860 ], 00:22:03.860 "driver_specific": {} 00:22:03.860 }' 00:22:03.860 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:04.119 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.377 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.378 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.378 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:04.378 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:04.378 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:04.636 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:04.636 "name": "BaseBdev4", 00:22:04.636 "aliases": [ 00:22:04.636 "8f10355a-2ebb-4e22-8dba-21555aa4af99" 00:22:04.636 ], 00:22:04.636 "product_name": "Malloc disk", 00:22:04.636 "block_size": 512, 00:22:04.636 "num_blocks": 65536, 00:22:04.636 "uuid": "8f10355a-2ebb-4e22-8dba-21555aa4af99", 00:22:04.636 "assigned_rate_limits": { 00:22:04.636 "rw_ios_per_sec": 0, 00:22:04.636 "rw_mbytes_per_sec": 0, 00:22:04.636 "r_mbytes_per_sec": 0, 00:22:04.636 "w_mbytes_per_sec": 0 00:22:04.636 }, 00:22:04.636 "claimed": true, 00:22:04.636 "claim_type": "exclusive_write", 00:22:04.636 "zoned": false, 00:22:04.636 "supported_io_types": { 00:22:04.636 "read": true, 00:22:04.636 "write": true, 00:22:04.636 "unmap": true, 00:22:04.636 "flush": true, 00:22:04.636 "reset": true, 00:22:04.636 "nvme_admin": false, 00:22:04.636 "nvme_io": false, 00:22:04.636 "nvme_io_md": false, 00:22:04.636 "write_zeroes": true, 00:22:04.636 "zcopy": true, 00:22:04.636 "get_zone_info": false, 00:22:04.636 "zone_management": false, 00:22:04.636 "zone_append": false, 00:22:04.636 "compare": false, 00:22:04.636 "compare_and_write": false, 00:22:04.636 "abort": true, 
00:22:04.636 "seek_hole": false, 00:22:04.636 "seek_data": false, 00:22:04.636 "copy": true, 00:22:04.636 "nvme_iov_md": false 00:22:04.636 }, 00:22:04.636 "memory_domains": [ 00:22:04.636 { 00:22:04.636 "dma_device_id": "system", 00:22:04.636 "dma_device_type": 1 00:22:04.636 }, 00:22:04.636 { 00:22:04.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.636 "dma_device_type": 2 00:22:04.636 } 00:22:04.636 ], 00:22:04.636 "driver_specific": {} 00:22:04.636 }' 00:22:04.636 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.636 07:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.636 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:04.636 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.636 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.636 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.636 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.636 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.908 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:04.908 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.908 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.908 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.908 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:05.179 [2024-07-25 07:27:37.505995] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.179 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:05.438 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.438 "name": "Existed_Raid", 00:22:05.438 "uuid": "07922a2f-c376-4f0f-9c80-4d732b1b8e9c", 00:22:05.438 "strip_size_kb": 0, 00:22:05.438 "state": "online", 00:22:05.438 "raid_level": "raid1", 00:22:05.438 "superblock": false, 00:22:05.438 "num_base_bdevs": 4, 00:22:05.438 "num_base_bdevs_discovered": 3, 00:22:05.438 "num_base_bdevs_operational": 3, 00:22:05.438 "base_bdevs_list": [ 00:22:05.438 { 00:22:05.438 "name": null, 00:22:05.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.438 "is_configured": false, 00:22:05.438 "data_offset": 0, 00:22:05.438 "data_size": 65536 00:22:05.438 }, 00:22:05.438 { 00:22:05.438 "name": "BaseBdev2", 00:22:05.438 "uuid": "235e5878-d832-4b45-8f3c-8f4da71d19c0", 00:22:05.438 "is_configured": true, 00:22:05.438 "data_offset": 0, 00:22:05.438 "data_size": 65536 00:22:05.438 }, 00:22:05.438 { 00:22:05.438 "name": "BaseBdev3", 00:22:05.438 "uuid": "bc76a967-3ea1-4f53-801b-fbcfe45f1ab0", 00:22:05.438 "is_configured": true, 00:22:05.438 "data_offset": 0, 00:22:05.438 "data_size": 65536 00:22:05.438 }, 00:22:05.438 { 00:22:05.438 "name": "BaseBdev4", 00:22:05.438 "uuid": "8f10355a-2ebb-4e22-8dba-21555aa4af99", 00:22:05.438 "is_configured": true, 00:22:05.438 "data_offset": 0, 00:22:05.438 "data_size": 65536 00:22:05.438 } 00:22:05.438 ] 00:22:05.438 }' 00:22:05.438 07:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:05.438 07:27:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:06.003 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:06.003 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:06.003 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.003 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:06.262 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:06.262 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:06.262 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:06.262 [2024-07-25 07:27:38.758335] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:06.262 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:06.262 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:06.262 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:06.262 07:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:06.520 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:06.520 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:06.520 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:06.778 [2024-07-25 07:27:39.225791] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:06.778 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:06.778 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:06.778 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.778 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:07.036 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:07.036 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:07.036 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:07.294 [2024-07-25 07:27:39.689208] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:07.294 [2024-07-25 07:27:39.689282] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:07.294 [2024-07-25 07:27:39.699614] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:07.294 [2024-07-25 07:27:39.699644] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:07.294 [2024-07-25 07:27:39.699655] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26967b0 name Existed_Raid, state offline 00:22:07.294 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:07.294 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:07.294 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.294 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:07.552 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:07.552 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:07.552 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:07.552 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:07.552 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:07.552 07:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev2 00:22:07.811 BaseBdev2 00:22:07.811 07:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:07.811 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:07.811 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:07.811 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:07.811 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:07.811 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:07.811 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:08.070 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:08.328 [ 00:22:08.328 { 00:22:08.328 "name": "BaseBdev2", 00:22:08.328 "aliases": [ 00:22:08.328 "6eb70ea1-3599-4e0b-8769-27b1e97a9e12" 00:22:08.328 ], 00:22:08.328 "product_name": "Malloc disk", 00:22:08.328 "block_size": 512, 00:22:08.328 "num_blocks": 65536, 00:22:08.328 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:08.328 "assigned_rate_limits": { 00:22:08.328 "rw_ios_per_sec": 0, 00:22:08.328 "rw_mbytes_per_sec": 0, 00:22:08.328 "r_mbytes_per_sec": 0, 00:22:08.328 "w_mbytes_per_sec": 0 00:22:08.328 }, 00:22:08.328 "claimed": false, 00:22:08.328 "zoned": false, 00:22:08.328 "supported_io_types": { 00:22:08.328 "read": true, 00:22:08.328 "write": true, 00:22:08.328 "unmap": true, 00:22:08.328 "flush": true, 00:22:08.328 "reset": true, 00:22:08.328 "nvme_admin": false, 00:22:08.328 "nvme_io": false, 00:22:08.328 "nvme_io_md": false, 00:22:08.328 "write_zeroes": true, 00:22:08.328 "zcopy": true, 00:22:08.328 "get_zone_info": false, 00:22:08.328 "zone_management": false, 00:22:08.328 "zone_append": false, 00:22:08.328 "compare": false, 00:22:08.328 "compare_and_write": false, 00:22:08.328 "abort": true, 00:22:08.328 "seek_hole": false, 00:22:08.328 "seek_data": false, 00:22:08.328 "copy": true, 00:22:08.328 "nvme_iov_md": false 00:22:08.328 }, 00:22:08.328 "memory_domains": [ 00:22:08.328 { 00:22:08.328 "dma_device_id": "system", 00:22:08.328 "dma_device_type": 1 00:22:08.328 }, 00:22:08.328 { 00:22:08.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:08.328 "dma_device_type": 2 00:22:08.328 } 00:22:08.328 ], 00:22:08.328 "driver_specific": {} 00:22:08.328 } 00:22:08.328 ] 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:08.328 BaseBdev3 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:22:08.328 07:27:40 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:08.328 07:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:08.586 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:08.844 [ 00:22:08.844 { 00:22:08.844 "name": "BaseBdev3", 00:22:08.844 "aliases": [ 00:22:08.844 "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c" 00:22:08.844 ], 00:22:08.844 "product_name": "Malloc disk", 00:22:08.844 "block_size": 512, 00:22:08.844 "num_blocks": 65536, 00:22:08.844 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:08.844 "assigned_rate_limits": { 00:22:08.844 "rw_ios_per_sec": 0, 00:22:08.844 "rw_mbytes_per_sec": 0, 00:22:08.844 "r_mbytes_per_sec": 0, 00:22:08.844 "w_mbytes_per_sec": 0 00:22:08.844 }, 00:22:08.844 "claimed": false, 00:22:08.844 "zoned": false, 00:22:08.844 "supported_io_types": { 00:22:08.844 "read": true, 00:22:08.844 "write": true, 00:22:08.844 "unmap": true, 00:22:08.844 "flush": true, 00:22:08.844 "reset": true, 00:22:08.844 "nvme_admin": false, 00:22:08.844 "nvme_io": false, 00:22:08.844 "nvme_io_md": false, 00:22:08.844 "write_zeroes": true, 00:22:08.844 "zcopy": true, 00:22:08.844 "get_zone_info": false, 00:22:08.844 "zone_management": false, 00:22:08.844 "zone_append": false, 00:22:08.844 "compare": false, 00:22:08.844 "compare_and_write": false, 00:22:08.844 "abort": true, 00:22:08.844 "seek_hole": false, 00:22:08.844 "seek_data": false, 00:22:08.844 "copy": true, 00:22:08.844 "nvme_iov_md": false 00:22:08.844 }, 00:22:08.844 "memory_domains": [ 00:22:08.844 { 00:22:08.844 "dma_device_id": "system", 00:22:08.844 "dma_device_type": 1 00:22:08.844 }, 00:22:08.844 { 00:22:08.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:08.844 "dma_device_type": 2 00:22:08.844 } 00:22:08.844 ], 00:22:08.844 "driver_specific": {} 00:22:08.844 } 00:22:08.844 ] 00:22:08.844 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:08.844 07:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:08.844 07:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:08.844 07:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:09.102 BaseBdev4 00:22:09.102 07:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:09.102 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:09.102 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:09.102 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:09.102 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
[[ -z '' ]] 00:22:09.102 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:09.102 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:09.360 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:09.618 [ 00:22:09.618 { 00:22:09.618 "name": "BaseBdev4", 00:22:09.618 "aliases": [ 00:22:09.618 "5d438267-5ad0-40dc-a870-05d52e9fa61f" 00:22:09.618 ], 00:22:09.618 "product_name": "Malloc disk", 00:22:09.618 "block_size": 512, 00:22:09.618 "num_blocks": 65536, 00:22:09.618 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:09.618 "assigned_rate_limits": { 00:22:09.619 "rw_ios_per_sec": 0, 00:22:09.619 "rw_mbytes_per_sec": 0, 00:22:09.619 "r_mbytes_per_sec": 0, 00:22:09.619 "w_mbytes_per_sec": 0 00:22:09.619 }, 00:22:09.619 "claimed": false, 00:22:09.619 "zoned": false, 00:22:09.619 "supported_io_types": { 00:22:09.619 "read": true, 00:22:09.619 "write": true, 00:22:09.619 "unmap": true, 00:22:09.619 "flush": true, 00:22:09.619 "reset": true, 00:22:09.619 "nvme_admin": false, 00:22:09.619 "nvme_io": false, 00:22:09.619 "nvme_io_md": false, 00:22:09.619 "write_zeroes": true, 00:22:09.619 "zcopy": true, 00:22:09.619 "get_zone_info": false, 00:22:09.619 "zone_management": false, 00:22:09.619 "zone_append": false, 00:22:09.619 "compare": false, 00:22:09.619 "compare_and_write": false, 00:22:09.619 "abort": true, 00:22:09.619 "seek_hole": false, 00:22:09.619 "seek_data": false, 00:22:09.619 "copy": true, 00:22:09.619 "nvme_iov_md": false 00:22:09.619 }, 00:22:09.619 "memory_domains": [ 00:22:09.619 { 00:22:09.619 "dma_device_id": "system", 00:22:09.619 "dma_device_type": 1 00:22:09.619 }, 00:22:09.619 { 00:22:09.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:09.619 "dma_device_type": 2 00:22:09.619 } 00:22:09.619 ], 00:22:09.619 "driver_specific": {} 00:22:09.619 } 00:22:09.619 ] 00:22:09.619 07:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:09.619 07:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:09.619 07:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:09.619 07:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:09.877 [2024-07-25 07:27:42.186628] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:09.877 [2024-07-25 07:27:42.186669] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:09.877 [2024-07-25 07:27:42.186687] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:09.878 [2024-07-25 07:27:42.187924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:09.878 [2024-07-25 07:27:42.187965] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:09.878 07:27:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.878 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:10.136 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.136 "name": "Existed_Raid", 00:22:10.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.136 "strip_size_kb": 0, 00:22:10.136 "state": "configuring", 00:22:10.136 "raid_level": "raid1", 00:22:10.136 "superblock": false, 00:22:10.136 "num_base_bdevs": 4, 00:22:10.136 "num_base_bdevs_discovered": 3, 00:22:10.136 "num_base_bdevs_operational": 4, 00:22:10.136 "base_bdevs_list": [ 00:22:10.136 { 00:22:10.136 "name": "BaseBdev1", 00:22:10.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.136 "is_configured": false, 00:22:10.136 "data_offset": 0, 00:22:10.136 "data_size": 0 00:22:10.136 }, 00:22:10.136 { 00:22:10.136 "name": "BaseBdev2", 00:22:10.136 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:10.136 "is_configured": true, 00:22:10.136 "data_offset": 0, 00:22:10.136 "data_size": 65536 00:22:10.136 }, 00:22:10.136 { 00:22:10.136 "name": "BaseBdev3", 00:22:10.136 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:10.136 "is_configured": true, 00:22:10.136 "data_offset": 0, 00:22:10.136 "data_size": 65536 00:22:10.136 }, 00:22:10.136 { 00:22:10.136 "name": "BaseBdev4", 00:22:10.136 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:10.136 "is_configured": true, 00:22:10.136 "data_offset": 0, 00:22:10.136 "data_size": 65536 00:22:10.136 } 00:22:10.136 ] 00:22:10.136 }' 00:22:10.136 07:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.136 07:27:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.702 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:10.702 [2024-07-25 07:27:43.217546] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:10.960 
07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:10.960 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.960 "name": "Existed_Raid", 00:22:10.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.960 "strip_size_kb": 0, 00:22:10.960 "state": "configuring", 00:22:10.960 "raid_level": "raid1", 00:22:10.960 "superblock": false, 00:22:10.960 "num_base_bdevs": 4, 00:22:10.960 "num_base_bdevs_discovered": 2, 00:22:10.960 "num_base_bdevs_operational": 4, 00:22:10.960 "base_bdevs_list": [ 00:22:10.960 { 00:22:10.960 "name": "BaseBdev1", 00:22:10.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.960 "is_configured": false, 00:22:10.960 "data_offset": 0, 00:22:10.960 "data_size": 0 00:22:10.960 }, 00:22:10.960 { 00:22:10.960 "name": null, 00:22:10.960 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:10.960 "is_configured": false, 00:22:10.960 "data_offset": 0, 00:22:10.960 "data_size": 65536 00:22:10.960 }, 00:22:10.960 { 00:22:10.960 "name": "BaseBdev3", 00:22:10.960 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:10.960 "is_configured": true, 00:22:10.960 "data_offset": 0, 00:22:10.960 "data_size": 65536 00:22:10.960 }, 00:22:10.960 { 00:22:10.960 "name": "BaseBdev4", 00:22:10.960 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:10.960 "is_configured": true, 00:22:10.961 "data_offset": 0, 00:22:10.961 "data_size": 65536 00:22:10.961 } 00:22:10.961 ] 00:22:10.961 }' 00:22:10.961 07:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.961 07:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:11.895 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.895 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:11.895 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:11.895 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:12.153 [2024-07-25 07:27:44.508100] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:12.153 BaseBdev1 00:22:12.153 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:12.153 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:12.153 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:12.153 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:12.153 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:12.153 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:12.153 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:12.412 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:12.670 [ 00:22:12.670 { 00:22:12.670 "name": "BaseBdev1", 00:22:12.670 "aliases": [ 00:22:12.670 "35768ea7-0eeb-4635-be0f-eadb5e72478b" 00:22:12.670 ], 00:22:12.670 "product_name": "Malloc disk", 00:22:12.670 "block_size": 512, 00:22:12.670 "num_blocks": 65536, 00:22:12.670 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:12.670 "assigned_rate_limits": { 00:22:12.670 "rw_ios_per_sec": 0, 00:22:12.670 "rw_mbytes_per_sec": 0, 00:22:12.670 "r_mbytes_per_sec": 0, 00:22:12.670 "w_mbytes_per_sec": 0 00:22:12.670 }, 00:22:12.670 "claimed": true, 00:22:12.670 "claim_type": "exclusive_write", 00:22:12.670 "zoned": false, 00:22:12.670 "supported_io_types": { 00:22:12.670 "read": true, 00:22:12.670 "write": true, 00:22:12.670 "unmap": true, 00:22:12.670 "flush": true, 00:22:12.670 "reset": true, 00:22:12.670 "nvme_admin": false, 00:22:12.670 "nvme_io": false, 00:22:12.670 "nvme_io_md": false, 00:22:12.670 "write_zeroes": true, 00:22:12.670 "zcopy": true, 00:22:12.670 "get_zone_info": false, 00:22:12.670 "zone_management": false, 00:22:12.670 "zone_append": false, 00:22:12.670 "compare": false, 00:22:12.670 "compare_and_write": false, 00:22:12.670 "abort": true, 00:22:12.670 "seek_hole": false, 00:22:12.670 "seek_data": false, 00:22:12.670 "copy": true, 00:22:12.670 "nvme_iov_md": false 00:22:12.670 }, 00:22:12.670 "memory_domains": [ 00:22:12.670 { 00:22:12.670 "dma_device_id": "system", 00:22:12.670 "dma_device_type": 1 00:22:12.670 }, 00:22:12.670 { 00:22:12.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.670 "dma_device_type": 2 00:22:12.670 } 00:22:12.670 ], 00:22:12.670 "driver_specific": {} 00:22:12.670 } 00:22:12.670 ] 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.670 07:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:12.929 07:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.929 "name": "Existed_Raid", 00:22:12.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.929 "strip_size_kb": 0, 00:22:12.929 "state": "configuring", 00:22:12.929 "raid_level": "raid1", 00:22:12.929 "superblock": false, 00:22:12.929 "num_base_bdevs": 4, 00:22:12.929 "num_base_bdevs_discovered": 3, 00:22:12.929 "num_base_bdevs_operational": 4, 00:22:12.929 "base_bdevs_list": [ 00:22:12.929 { 00:22:12.929 "name": "BaseBdev1", 00:22:12.929 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:12.929 "is_configured": true, 00:22:12.929 "data_offset": 0, 00:22:12.929 "data_size": 65536 00:22:12.929 }, 00:22:12.929 { 00:22:12.929 "name": null, 00:22:12.929 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:12.929 "is_configured": false, 00:22:12.929 "data_offset": 0, 00:22:12.929 "data_size": 65536 00:22:12.929 }, 00:22:12.929 { 00:22:12.929 "name": "BaseBdev3", 00:22:12.929 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:12.929 "is_configured": true, 00:22:12.929 "data_offset": 0, 00:22:12.929 "data_size": 65536 00:22:12.929 }, 00:22:12.929 { 00:22:12.929 "name": "BaseBdev4", 00:22:12.929 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:12.929 "is_configured": true, 00:22:12.929 "data_offset": 0, 00:22:12.929 "data_size": 65536 00:22:12.929 } 00:22:12.929 ] 00:22:12.929 }' 00:22:12.929 07:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.929 07:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:13.495 07:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.495 07:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:13.495 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:13.495 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:13.754 [2024-07-25 07:27:46.220626] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.754 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:14.014 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.014 "name": "Existed_Raid", 00:22:14.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:14.014 "strip_size_kb": 0, 00:22:14.014 "state": "configuring", 00:22:14.014 "raid_level": "raid1", 00:22:14.014 "superblock": false, 00:22:14.014 "num_base_bdevs": 4, 00:22:14.014 "num_base_bdevs_discovered": 2, 00:22:14.014 "num_base_bdevs_operational": 4, 00:22:14.014 "base_bdevs_list": [ 00:22:14.014 { 00:22:14.014 "name": "BaseBdev1", 00:22:14.014 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:14.014 "is_configured": true, 00:22:14.014 "data_offset": 0, 00:22:14.014 "data_size": 65536 00:22:14.014 }, 00:22:14.014 { 00:22:14.014 "name": null, 00:22:14.014 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:14.014 "is_configured": false, 00:22:14.014 "data_offset": 0, 00:22:14.014 "data_size": 65536 00:22:14.014 }, 00:22:14.014 { 00:22:14.014 "name": null, 00:22:14.014 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:14.014 "is_configured": false, 00:22:14.014 "data_offset": 0, 00:22:14.014 "data_size": 65536 00:22:14.014 }, 00:22:14.014 { 00:22:14.014 "name": "BaseBdev4", 00:22:14.014 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:14.014 "is_configured": true, 00:22:14.014 "data_offset": 0, 00:22:14.014 "data_size": 65536 00:22:14.014 } 00:22:14.014 ] 00:22:14.014 }' 00:22:14.014 07:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.014 07:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.580 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.580 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:14.838 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:14.838 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 
00:22:15.096 [2024-07-25 07:27:47.487972] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.096 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:15.355 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.355 "name": "Existed_Raid", 00:22:15.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.355 "strip_size_kb": 0, 00:22:15.355 "state": "configuring", 00:22:15.355 "raid_level": "raid1", 00:22:15.355 "superblock": false, 00:22:15.355 "num_base_bdevs": 4, 00:22:15.355 "num_base_bdevs_discovered": 3, 00:22:15.355 "num_base_bdevs_operational": 4, 00:22:15.355 "base_bdevs_list": [ 00:22:15.355 { 00:22:15.355 "name": "BaseBdev1", 00:22:15.355 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:15.355 "is_configured": true, 00:22:15.355 "data_offset": 0, 00:22:15.355 "data_size": 65536 00:22:15.355 }, 00:22:15.355 { 00:22:15.355 "name": null, 00:22:15.355 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:15.355 "is_configured": false, 00:22:15.355 "data_offset": 0, 00:22:15.355 "data_size": 65536 00:22:15.355 }, 00:22:15.355 { 00:22:15.355 "name": "BaseBdev3", 00:22:15.355 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:15.355 "is_configured": true, 00:22:15.355 "data_offset": 0, 00:22:15.355 "data_size": 65536 00:22:15.355 }, 00:22:15.355 { 00:22:15.355 "name": "BaseBdev4", 00:22:15.355 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:15.355 "is_configured": true, 00:22:15.355 "data_offset": 0, 00:22:15.355 "data_size": 65536 00:22:15.355 } 00:22:15.355 ] 00:22:15.355 }' 00:22:15.355 07:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.355 07:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:15.949 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.949 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:22:16.207 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:16.207 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:16.207 [2024-07-25 07:27:48.723246] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:16.465 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.465 "name": "Existed_Raid", 00:22:16.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.465 "strip_size_kb": 0, 00:22:16.465 "state": "configuring", 00:22:16.465 "raid_level": "raid1", 00:22:16.465 "superblock": false, 00:22:16.465 "num_base_bdevs": 4, 00:22:16.465 "num_base_bdevs_discovered": 2, 00:22:16.465 "num_base_bdevs_operational": 4, 00:22:16.465 "base_bdevs_list": [ 00:22:16.465 { 00:22:16.465 "name": null, 00:22:16.466 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:16.466 "is_configured": false, 00:22:16.466 "data_offset": 0, 00:22:16.466 "data_size": 65536 00:22:16.466 }, 00:22:16.466 { 00:22:16.466 "name": null, 00:22:16.466 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:16.466 "is_configured": false, 00:22:16.466 "data_offset": 0, 00:22:16.466 "data_size": 65536 00:22:16.466 }, 00:22:16.466 { 00:22:16.466 "name": "BaseBdev3", 00:22:16.466 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:16.466 "is_configured": true, 00:22:16.466 "data_offset": 0, 00:22:16.466 "data_size": 65536 00:22:16.466 }, 00:22:16.466 { 00:22:16.466 "name": "BaseBdev4", 00:22:16.466 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:16.466 "is_configured": true, 00:22:16.466 "data_offset": 0, 00:22:16.466 "data_size": 65536 00:22:16.466 } 00:22:16.466 ] 00:22:16.466 }' 00:22:16.466 07:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.466 07:27:48 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:22:17.034 07:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.034 07:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:17.292 07:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:17.292 07:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:17.551 [2024-07-25 07:27:49.984690] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:17.551 07:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.551 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:17.809 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.809 "name": "Existed_Raid", 00:22:17.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.809 "strip_size_kb": 0, 00:22:17.809 "state": "configuring", 00:22:17.809 "raid_level": "raid1", 00:22:17.809 "superblock": false, 00:22:17.809 "num_base_bdevs": 4, 00:22:17.809 "num_base_bdevs_discovered": 3, 00:22:17.809 "num_base_bdevs_operational": 4, 00:22:17.809 "base_bdevs_list": [ 00:22:17.809 { 00:22:17.809 "name": null, 00:22:17.809 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:17.809 "is_configured": false, 00:22:17.809 "data_offset": 0, 00:22:17.809 "data_size": 65536 00:22:17.809 }, 00:22:17.809 { 00:22:17.809 "name": "BaseBdev2", 00:22:17.809 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:17.809 "is_configured": true, 00:22:17.809 "data_offset": 0, 00:22:17.809 "data_size": 65536 00:22:17.809 }, 00:22:17.809 { 00:22:17.809 "name": "BaseBdev3", 00:22:17.810 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:17.810 "is_configured": true, 00:22:17.810 "data_offset": 0, 00:22:17.810 "data_size": 65536 00:22:17.810 }, 00:22:17.810 { 00:22:17.810 "name": "BaseBdev4", 
00:22:17.810 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:17.810 "is_configured": true, 00:22:17.810 "data_offset": 0, 00:22:17.810 "data_size": 65536 00:22:17.810 } 00:22:17.810 ] 00:22:17.810 }' 00:22:17.810 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.810 07:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:18.376 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.376 07:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:18.633 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:18.633 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:18.633 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.891 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 35768ea7-0eeb-4635-be0f-eadb5e72478b 00:22:19.149 [2024-07-25 07:27:51.435766] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:19.149 [2024-07-25 07:27:51.435803] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26964b0 00:22:19.149 [2024-07-25 07:27:51.435811] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:19.149 [2024-07-25 07:27:51.435992] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x268def0 00:22:19.149 [2024-07-25 07:27:51.436109] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26964b0 00:22:19.149 [2024-07-25 07:27:51.436118] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26964b0 00:22:19.149 [2024-07-25 07:27:51.436280] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:19.149 NewBaseBdev 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:19.149 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:19.408 [ 00:22:19.408 { 00:22:19.408 "name": "NewBaseBdev", 00:22:19.408 "aliases": [ 00:22:19.408 
"35768ea7-0eeb-4635-be0f-eadb5e72478b" 00:22:19.408 ], 00:22:19.408 "product_name": "Malloc disk", 00:22:19.408 "block_size": 512, 00:22:19.408 "num_blocks": 65536, 00:22:19.408 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:19.408 "assigned_rate_limits": { 00:22:19.408 "rw_ios_per_sec": 0, 00:22:19.408 "rw_mbytes_per_sec": 0, 00:22:19.408 "r_mbytes_per_sec": 0, 00:22:19.408 "w_mbytes_per_sec": 0 00:22:19.408 }, 00:22:19.408 "claimed": true, 00:22:19.408 "claim_type": "exclusive_write", 00:22:19.408 "zoned": false, 00:22:19.408 "supported_io_types": { 00:22:19.408 "read": true, 00:22:19.408 "write": true, 00:22:19.408 "unmap": true, 00:22:19.408 "flush": true, 00:22:19.408 "reset": true, 00:22:19.408 "nvme_admin": false, 00:22:19.408 "nvme_io": false, 00:22:19.408 "nvme_io_md": false, 00:22:19.408 "write_zeroes": true, 00:22:19.408 "zcopy": true, 00:22:19.408 "get_zone_info": false, 00:22:19.408 "zone_management": false, 00:22:19.408 "zone_append": false, 00:22:19.408 "compare": false, 00:22:19.408 "compare_and_write": false, 00:22:19.408 "abort": true, 00:22:19.408 "seek_hole": false, 00:22:19.408 "seek_data": false, 00:22:19.408 "copy": true, 00:22:19.408 "nvme_iov_md": false 00:22:19.408 }, 00:22:19.408 "memory_domains": [ 00:22:19.408 { 00:22:19.408 "dma_device_id": "system", 00:22:19.408 "dma_device_type": 1 00:22:19.408 }, 00:22:19.408 { 00:22:19.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:19.408 "dma_device_type": 2 00:22:19.408 } 00:22:19.408 ], 00:22:19.408 "driver_specific": {} 00:22:19.408 } 00:22:19.408 ] 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.408 07:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.667 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.667 "name": "Existed_Raid", 00:22:19.667 "uuid": "149a3151-6f72-4327-8c1b-02c8e1fa3b94", 00:22:19.667 "strip_size_kb": 0, 00:22:19.667 "state": "online", 00:22:19.667 "raid_level": "raid1", 00:22:19.667 "superblock": false, 00:22:19.667 "num_base_bdevs": 4, 00:22:19.667 
"num_base_bdevs_discovered": 4, 00:22:19.667 "num_base_bdevs_operational": 4, 00:22:19.667 "base_bdevs_list": [ 00:22:19.667 { 00:22:19.667 "name": "NewBaseBdev", 00:22:19.667 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:19.667 "is_configured": true, 00:22:19.667 "data_offset": 0, 00:22:19.667 "data_size": 65536 00:22:19.667 }, 00:22:19.667 { 00:22:19.667 "name": "BaseBdev2", 00:22:19.667 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:19.667 "is_configured": true, 00:22:19.667 "data_offset": 0, 00:22:19.667 "data_size": 65536 00:22:19.667 }, 00:22:19.667 { 00:22:19.667 "name": "BaseBdev3", 00:22:19.667 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:19.667 "is_configured": true, 00:22:19.667 "data_offset": 0, 00:22:19.667 "data_size": 65536 00:22:19.667 }, 00:22:19.667 { 00:22:19.667 "name": "BaseBdev4", 00:22:19.667 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:19.667 "is_configured": true, 00:22:19.667 "data_offset": 0, 00:22:19.667 "data_size": 65536 00:22:19.667 } 00:22:19.667 ] 00:22:19.667 }' 00:22:19.667 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.667 07:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:20.232 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:20.490 [2024-07-25 07:27:52.928012] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:20.490 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:20.490 "name": "Existed_Raid", 00:22:20.490 "aliases": [ 00:22:20.490 "149a3151-6f72-4327-8c1b-02c8e1fa3b94" 00:22:20.490 ], 00:22:20.490 "product_name": "Raid Volume", 00:22:20.490 "block_size": 512, 00:22:20.490 "num_blocks": 65536, 00:22:20.490 "uuid": "149a3151-6f72-4327-8c1b-02c8e1fa3b94", 00:22:20.490 "assigned_rate_limits": { 00:22:20.490 "rw_ios_per_sec": 0, 00:22:20.490 "rw_mbytes_per_sec": 0, 00:22:20.490 "r_mbytes_per_sec": 0, 00:22:20.490 "w_mbytes_per_sec": 0 00:22:20.490 }, 00:22:20.490 "claimed": false, 00:22:20.490 "zoned": false, 00:22:20.490 "supported_io_types": { 00:22:20.490 "read": true, 00:22:20.490 "write": true, 00:22:20.490 "unmap": false, 00:22:20.490 "flush": false, 00:22:20.491 "reset": true, 00:22:20.491 "nvme_admin": false, 00:22:20.491 "nvme_io": false, 00:22:20.491 "nvme_io_md": false, 00:22:20.491 "write_zeroes": true, 00:22:20.491 "zcopy": false, 00:22:20.491 "get_zone_info": false, 00:22:20.491 "zone_management": false, 00:22:20.491 "zone_append": false, 00:22:20.491 "compare": false, 00:22:20.491 "compare_and_write": false, 00:22:20.491 "abort": 
false, 00:22:20.491 "seek_hole": false, 00:22:20.491 "seek_data": false, 00:22:20.491 "copy": false, 00:22:20.491 "nvme_iov_md": false 00:22:20.491 }, 00:22:20.491 "memory_domains": [ 00:22:20.491 { 00:22:20.491 "dma_device_id": "system", 00:22:20.491 "dma_device_type": 1 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.491 "dma_device_type": 2 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "dma_device_id": "system", 00:22:20.491 "dma_device_type": 1 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.491 "dma_device_type": 2 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "dma_device_id": "system", 00:22:20.491 "dma_device_type": 1 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.491 "dma_device_type": 2 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "dma_device_id": "system", 00:22:20.491 "dma_device_type": 1 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.491 "dma_device_type": 2 00:22:20.491 } 00:22:20.491 ], 00:22:20.491 "driver_specific": { 00:22:20.491 "raid": { 00:22:20.491 "uuid": "149a3151-6f72-4327-8c1b-02c8e1fa3b94", 00:22:20.491 "strip_size_kb": 0, 00:22:20.491 "state": "online", 00:22:20.491 "raid_level": "raid1", 00:22:20.491 "superblock": false, 00:22:20.491 "num_base_bdevs": 4, 00:22:20.491 "num_base_bdevs_discovered": 4, 00:22:20.491 "num_base_bdevs_operational": 4, 00:22:20.491 "base_bdevs_list": [ 00:22:20.491 { 00:22:20.491 "name": "NewBaseBdev", 00:22:20.491 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:20.491 "is_configured": true, 00:22:20.491 "data_offset": 0, 00:22:20.491 "data_size": 65536 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "name": "BaseBdev2", 00:22:20.491 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:20.491 "is_configured": true, 00:22:20.491 "data_offset": 0, 00:22:20.491 "data_size": 65536 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "name": "BaseBdev3", 00:22:20.491 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:20.491 "is_configured": true, 00:22:20.491 "data_offset": 0, 00:22:20.491 "data_size": 65536 00:22:20.491 }, 00:22:20.491 { 00:22:20.491 "name": "BaseBdev4", 00:22:20.491 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:20.491 "is_configured": true, 00:22:20.491 "data_offset": 0, 00:22:20.491 "data_size": 65536 00:22:20.491 } 00:22:20.491 ] 00:22:20.491 } 00:22:20.491 } 00:22:20.491 }' 00:22:20.491 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:20.491 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:20.491 BaseBdev2 00:22:20.491 BaseBdev3 00:22:20.491 BaseBdev4' 00:22:20.491 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:20.491 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:20.491 07:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:20.748 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:20.748 "name": "NewBaseBdev", 00:22:20.748 "aliases": [ 00:22:20.748 "35768ea7-0eeb-4635-be0f-eadb5e72478b" 00:22:20.748 ], 00:22:20.748 "product_name": "Malloc disk", 00:22:20.748 
"block_size": 512, 00:22:20.748 "num_blocks": 65536, 00:22:20.748 "uuid": "35768ea7-0eeb-4635-be0f-eadb5e72478b", 00:22:20.748 "assigned_rate_limits": { 00:22:20.748 "rw_ios_per_sec": 0, 00:22:20.748 "rw_mbytes_per_sec": 0, 00:22:20.748 "r_mbytes_per_sec": 0, 00:22:20.748 "w_mbytes_per_sec": 0 00:22:20.748 }, 00:22:20.748 "claimed": true, 00:22:20.748 "claim_type": "exclusive_write", 00:22:20.748 "zoned": false, 00:22:20.748 "supported_io_types": { 00:22:20.748 "read": true, 00:22:20.748 "write": true, 00:22:20.748 "unmap": true, 00:22:20.748 "flush": true, 00:22:20.748 "reset": true, 00:22:20.748 "nvme_admin": false, 00:22:20.748 "nvme_io": false, 00:22:20.748 "nvme_io_md": false, 00:22:20.748 "write_zeroes": true, 00:22:20.748 "zcopy": true, 00:22:20.748 "get_zone_info": false, 00:22:20.748 "zone_management": false, 00:22:20.748 "zone_append": false, 00:22:20.748 "compare": false, 00:22:20.748 "compare_and_write": false, 00:22:20.748 "abort": true, 00:22:20.748 "seek_hole": false, 00:22:20.748 "seek_data": false, 00:22:20.748 "copy": true, 00:22:20.748 "nvme_iov_md": false 00:22:20.748 }, 00:22:20.748 "memory_domains": [ 00:22:20.748 { 00:22:20.748 "dma_device_id": "system", 00:22:20.748 "dma_device_type": 1 00:22:20.748 }, 00:22:20.748 { 00:22:20.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.748 "dma_device_type": 2 00:22:20.748 } 00:22:20.748 ], 00:22:20.748 "driver_specific": {} 00:22:20.748 }' 00:22:20.748 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:20.748 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:21.005 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:21.263 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:21.263 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:21.263 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:21.263 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:21.263 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:21.263 "name": "BaseBdev2", 00:22:21.263 "aliases": [ 00:22:21.263 "6eb70ea1-3599-4e0b-8769-27b1e97a9e12" 00:22:21.263 ], 00:22:21.263 "product_name": "Malloc disk", 00:22:21.263 "block_size": 512, 00:22:21.263 "num_blocks": 65536, 00:22:21.263 "uuid": "6eb70ea1-3599-4e0b-8769-27b1e97a9e12", 00:22:21.263 "assigned_rate_limits": { 00:22:21.263 
"rw_ios_per_sec": 0, 00:22:21.263 "rw_mbytes_per_sec": 0, 00:22:21.263 "r_mbytes_per_sec": 0, 00:22:21.263 "w_mbytes_per_sec": 0 00:22:21.263 }, 00:22:21.263 "claimed": true, 00:22:21.263 "claim_type": "exclusive_write", 00:22:21.263 "zoned": false, 00:22:21.263 "supported_io_types": { 00:22:21.263 "read": true, 00:22:21.263 "write": true, 00:22:21.263 "unmap": true, 00:22:21.263 "flush": true, 00:22:21.263 "reset": true, 00:22:21.263 "nvme_admin": false, 00:22:21.263 "nvme_io": false, 00:22:21.263 "nvme_io_md": false, 00:22:21.263 "write_zeroes": true, 00:22:21.263 "zcopy": true, 00:22:21.263 "get_zone_info": false, 00:22:21.263 "zone_management": false, 00:22:21.263 "zone_append": false, 00:22:21.263 "compare": false, 00:22:21.263 "compare_and_write": false, 00:22:21.263 "abort": true, 00:22:21.263 "seek_hole": false, 00:22:21.263 "seek_data": false, 00:22:21.263 "copy": true, 00:22:21.263 "nvme_iov_md": false 00:22:21.263 }, 00:22:21.263 "memory_domains": [ 00:22:21.263 { 00:22:21.263 "dma_device_id": "system", 00:22:21.263 "dma_device_type": 1 00:22:21.263 }, 00:22:21.263 { 00:22:21.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:21.263 "dma_device_type": 2 00:22:21.263 } 00:22:21.263 ], 00:22:21.263 "driver_specific": {} 00:22:21.263 }' 00:22:21.263 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:21.520 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:21.520 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:21.520 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:21.520 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:21.520 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:21.520 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:21.520 07:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:21.520 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:21.520 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:21.778 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:21.778 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:21.778 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:21.778 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:21.778 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:22.036 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:22.036 "name": "BaseBdev3", 00:22:22.036 "aliases": [ 00:22:22.036 "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c" 00:22:22.036 ], 00:22:22.036 "product_name": "Malloc disk", 00:22:22.036 "block_size": 512, 00:22:22.036 "num_blocks": 65536, 00:22:22.036 "uuid": "efe0fc11-b5a8-4f59-b9cc-2620d2f3655c", 00:22:22.036 "assigned_rate_limits": { 00:22:22.036 "rw_ios_per_sec": 0, 00:22:22.036 "rw_mbytes_per_sec": 0, 00:22:22.036 "r_mbytes_per_sec": 0, 00:22:22.036 "w_mbytes_per_sec": 0 00:22:22.036 }, 00:22:22.036 "claimed": true, 
00:22:22.036 "claim_type": "exclusive_write", 00:22:22.036 "zoned": false, 00:22:22.036 "supported_io_types": { 00:22:22.036 "read": true, 00:22:22.036 "write": true, 00:22:22.036 "unmap": true, 00:22:22.036 "flush": true, 00:22:22.036 "reset": true, 00:22:22.036 "nvme_admin": false, 00:22:22.036 "nvme_io": false, 00:22:22.036 "nvme_io_md": false, 00:22:22.036 "write_zeroes": true, 00:22:22.036 "zcopy": true, 00:22:22.036 "get_zone_info": false, 00:22:22.036 "zone_management": false, 00:22:22.036 "zone_append": false, 00:22:22.036 "compare": false, 00:22:22.036 "compare_and_write": false, 00:22:22.036 "abort": true, 00:22:22.036 "seek_hole": false, 00:22:22.036 "seek_data": false, 00:22:22.036 "copy": true, 00:22:22.036 "nvme_iov_md": false 00:22:22.036 }, 00:22:22.036 "memory_domains": [ 00:22:22.036 { 00:22:22.036 "dma_device_id": "system", 00:22:22.036 "dma_device_type": 1 00:22:22.036 }, 00:22:22.036 { 00:22:22.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:22.036 "dma_device_type": 2 00:22:22.036 } 00:22:22.036 ], 00:22:22.037 "driver_specific": {} 00:22:22.037 }' 00:22:22.037 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:22.037 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:22.037 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:22.037 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:22.037 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:22.037 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:22.037 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:22.294 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:22.552 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:22.552 "name": "BaseBdev4", 00:22:22.552 "aliases": [ 00:22:22.552 "5d438267-5ad0-40dc-a870-05d52e9fa61f" 00:22:22.552 ], 00:22:22.552 "product_name": "Malloc disk", 00:22:22.552 "block_size": 512, 00:22:22.552 "num_blocks": 65536, 00:22:22.552 "uuid": "5d438267-5ad0-40dc-a870-05d52e9fa61f", 00:22:22.552 "assigned_rate_limits": { 00:22:22.552 "rw_ios_per_sec": 0, 00:22:22.552 "rw_mbytes_per_sec": 0, 00:22:22.552 "r_mbytes_per_sec": 0, 00:22:22.552 "w_mbytes_per_sec": 0 00:22:22.552 }, 00:22:22.552 "claimed": true, 00:22:22.552 "claim_type": "exclusive_write", 00:22:22.552 "zoned": false, 00:22:22.552 "supported_io_types": { 00:22:22.552 "read": true, 00:22:22.552 "write": true, 00:22:22.552 
"unmap": true, 00:22:22.552 "flush": true, 00:22:22.552 "reset": true, 00:22:22.552 "nvme_admin": false, 00:22:22.552 "nvme_io": false, 00:22:22.552 "nvme_io_md": false, 00:22:22.552 "write_zeroes": true, 00:22:22.552 "zcopy": true, 00:22:22.552 "get_zone_info": false, 00:22:22.552 "zone_management": false, 00:22:22.552 "zone_append": false, 00:22:22.552 "compare": false, 00:22:22.552 "compare_and_write": false, 00:22:22.552 "abort": true, 00:22:22.552 "seek_hole": false, 00:22:22.552 "seek_data": false, 00:22:22.552 "copy": true, 00:22:22.552 "nvme_iov_md": false 00:22:22.552 }, 00:22:22.552 "memory_domains": [ 00:22:22.552 { 00:22:22.552 "dma_device_id": "system", 00:22:22.552 "dma_device_type": 1 00:22:22.552 }, 00:22:22.552 { 00:22:22.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:22.552 "dma_device_type": 2 00:22:22.552 } 00:22:22.552 ], 00:22:22.552 "driver_specific": {} 00:22:22.552 }' 00:22:22.552 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:22.552 07:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:22.552 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:22.552 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:22.552 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:22.810 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:23.068 [2024-07-25 07:27:55.490451] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:23.068 [2024-07-25 07:27:55.490478] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:23.068 [2024-07-25 07:27:55.490530] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:23.068 [2024-07-25 07:27:55.490776] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:23.068 [2024-07-25 07:27:55.490788] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26964b0 name Existed_Raid, state offline 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1695745 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1695745 ']' 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1695745 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # 
'[' Linux = Linux ']' 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1695745 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1695745' 00:22:23.068 killing process with pid 1695745 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1695745 00:22:23.068 [2024-07-25 07:27:55.557437] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:23.068 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1695745 00:22:23.068 [2024-07-25 07:27:55.587996] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:23.326 00:22:23.326 real 0m30.616s 00:22:23.326 user 0m56.147s 00:22:23.326 sys 0m5.599s 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.326 ************************************ 00:22:23.326 END TEST raid_state_function_test 00:22:23.326 ************************************ 00:22:23.326 07:27:55 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:22:23.326 07:27:55 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:22:23.326 07:27:55 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:23.326 07:27:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:23.326 ************************************ 00:22:23.326 START TEST raid_state_function_test_sb 00:22:23.326 ************************************ 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 true 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:23.326 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:23.326 07:27:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:23.327 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1701439 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1701439' 00:22:23.585 Process raid pid: 1701439 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1701439 /var/tmp/spdk-raid.sock 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1701439 ']' 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:23.585 07:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:23.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:23.586 07:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:23.586 07:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.586 [2024-07-25 07:27:55.917339] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
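(Editor's note: the _sb variant started here runs the same state-function flow with an on-disk superblock, i.e. bdev_raid_create is invoked with -s. A rough by-hand equivalent against the bdev_svc app launched above would be the two RPCs sketched below; they mirror commands that appear later in this log, with sizes and names unchanged, and are illustrative only.)

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    # Create one 32 MB malloc bdev with 512-byte blocks as a base device
    # (the test repeats this for BaseBdev2..BaseBdev4).
    "$RPC" -s "$SOCK" bdev_malloc_create 32 512 -b BaseBdev1
    # Assemble a raid1 volume named Existed_Raid with an on-disk superblock (-s).
    "$RPC" -s "$SOCK" bdev_raid_create -s -r raid1 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid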
00:22:23.586 [2024-07-25 07:27:55.917396] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.0 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.1 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.2 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.3 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.4 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.5 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.6 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:01.7 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.0 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.1 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.2 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.3 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.4 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.5 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.6 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3d:02.7 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.0 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.1 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.2 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.3 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.4 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.5 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.6 cannot be used 00:22:23.586 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:01.7 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.0 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.1 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.2 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.3 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.4 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.5 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.6 cannot be used 00:22:23.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:23.586 EAL: Requested device 0000:3f:02.7 cannot be used 00:22:23.586 [2024-07-25 07:27:56.049626] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:23.844 [2024-07-25 07:27:56.136635] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:22:23.844 [2024-07-25 07:27:56.193557] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:23.844 [2024-07-25 07:27:56.193590] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:24.410 07:27:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:24.410 07:27:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:22:24.410 07:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:24.976 [2024-07-25 07:27:57.216025] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:24.976 [2024-07-25 07:27:57.216062] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:24.976 [2024-07-25 07:27:57.216071] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:24.976 [2024-07-25 07:27:57.216082] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:24.976 [2024-07-25 07:27:57.216090] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:24.976 [2024-07-25 07:27:57.216100] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:24.976 [2024-07-25 07:27:57.216108] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:24.976 [2024-07-25 07:27:57.216118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:24.976 07:27:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.976 "name": "Existed_Raid", 00:22:24.976 "uuid": "6f404640-ac3b-4019-b59e-5ab2cc9454a3", 00:22:24.976 "strip_size_kb": 0, 00:22:24.976 "state": "configuring", 00:22:24.976 "raid_level": "raid1", 00:22:24.976 "superblock": true, 00:22:24.976 "num_base_bdevs": 4, 00:22:24.976 "num_base_bdevs_discovered": 0, 00:22:24.976 "num_base_bdevs_operational": 4, 00:22:24.976 "base_bdevs_list": [ 00:22:24.976 { 00:22:24.976 "name": "BaseBdev1", 00:22:24.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.976 "is_configured": false, 00:22:24.976 "data_offset": 0, 00:22:24.976 "data_size": 0 00:22:24.976 }, 00:22:24.976 { 00:22:24.976 "name": "BaseBdev2", 00:22:24.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.976 "is_configured": false, 00:22:24.976 "data_offset": 0, 00:22:24.976 "data_size": 0 00:22:24.976 }, 00:22:24.976 { 00:22:24.976 "name": "BaseBdev3", 00:22:24.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.976 "is_configured": false, 00:22:24.976 "data_offset": 0, 00:22:24.976 "data_size": 0 00:22:24.976 }, 00:22:24.976 { 00:22:24.976 "name": "BaseBdev4", 00:22:24.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.976 "is_configured": false, 00:22:24.976 "data_offset": 0, 00:22:24.976 "data_size": 0 00:22:24.976 } 00:22:24.976 ] 00:22:24.976 }' 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.976 07:27:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:25.910 07:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:26.168 [2024-07-25 07:27:58.535350] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:26.168 [2024-07-25 07:27:58.535375] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x271fee0 name Existed_Raid, state configuring 00:22:26.168 07:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 
'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:26.426 [2024-07-25 07:27:58.763965] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:26.426 [2024-07-25 07:27:58.763990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:26.426 [2024-07-25 07:27:58.763999] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:26.426 [2024-07-25 07:27:58.764009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:26.426 [2024-07-25 07:27:58.764017] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:26.426 [2024-07-25 07:27:58.764028] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:26.426 [2024-07-25 07:27:58.764036] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:26.426 [2024-07-25 07:27:58.764046] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:26.426 07:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:26.684 [2024-07-25 07:27:59.001951] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:26.684 BaseBdev1 00:22:26.684 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:26.684 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:26.684 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:26.684 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:26.684 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:26.684 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:26.684 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:26.942 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:26.942 [ 00:22:26.942 { 00:22:26.942 "name": "BaseBdev1", 00:22:26.942 "aliases": [ 00:22:26.942 "7359f8fd-1447-4eda-afde-a73d1182ce03" 00:22:26.942 ], 00:22:26.942 "product_name": "Malloc disk", 00:22:26.942 "block_size": 512, 00:22:26.942 "num_blocks": 65536, 00:22:26.942 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:26.942 "assigned_rate_limits": { 00:22:26.942 "rw_ios_per_sec": 0, 00:22:26.942 "rw_mbytes_per_sec": 0, 00:22:26.942 "r_mbytes_per_sec": 0, 00:22:26.942 "w_mbytes_per_sec": 0 00:22:26.942 }, 00:22:26.942 "claimed": true, 00:22:26.942 "claim_type": "exclusive_write", 00:22:26.942 "zoned": false, 00:22:26.942 "supported_io_types": { 00:22:26.942 "read": true, 00:22:26.942 "write": true, 00:22:26.942 "unmap": true, 00:22:26.942 "flush": true, 00:22:26.942 "reset": true, 00:22:26.942 "nvme_admin": false, 00:22:26.942 "nvme_io": false, 00:22:26.942 "nvme_io_md": false, 00:22:26.942 "write_zeroes": true, 00:22:26.942 
"zcopy": true, 00:22:26.942 "get_zone_info": false, 00:22:26.942 "zone_management": false, 00:22:26.942 "zone_append": false, 00:22:26.942 "compare": false, 00:22:26.942 "compare_and_write": false, 00:22:26.942 "abort": true, 00:22:26.942 "seek_hole": false, 00:22:26.942 "seek_data": false, 00:22:26.942 "copy": true, 00:22:26.942 "nvme_iov_md": false 00:22:26.942 }, 00:22:26.942 "memory_domains": [ 00:22:26.942 { 00:22:26.942 "dma_device_id": "system", 00:22:26.942 "dma_device_type": 1 00:22:26.942 }, 00:22:26.942 { 00:22:26.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.942 "dma_device_type": 2 00:22:26.942 } 00:22:26.942 ], 00:22:26.942 "driver_specific": {} 00:22:26.942 } 00:22:26.942 ] 00:22:26.942 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.200 "name": "Existed_Raid", 00:22:27.200 "uuid": "d8c9474d-9d4c-4e9f-a67b-9c913c19ab2a", 00:22:27.200 "strip_size_kb": 0, 00:22:27.200 "state": "configuring", 00:22:27.200 "raid_level": "raid1", 00:22:27.200 "superblock": true, 00:22:27.200 "num_base_bdevs": 4, 00:22:27.200 "num_base_bdevs_discovered": 1, 00:22:27.200 "num_base_bdevs_operational": 4, 00:22:27.200 "base_bdevs_list": [ 00:22:27.200 { 00:22:27.200 "name": "BaseBdev1", 00:22:27.200 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:27.200 "is_configured": true, 00:22:27.200 "data_offset": 2048, 00:22:27.200 "data_size": 63488 00:22:27.200 }, 00:22:27.200 { 00:22:27.200 "name": "BaseBdev2", 00:22:27.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.200 "is_configured": false, 00:22:27.200 "data_offset": 0, 00:22:27.200 "data_size": 0 00:22:27.200 }, 00:22:27.200 { 00:22:27.200 "name": "BaseBdev3", 00:22:27.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.200 "is_configured": false, 00:22:27.200 "data_offset": 0, 00:22:27.200 "data_size": 0 00:22:27.200 }, 00:22:27.200 { 00:22:27.200 "name": 
"BaseBdev4", 00:22:27.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.200 "is_configured": false, 00:22:27.200 "data_offset": 0, 00:22:27.200 "data_size": 0 00:22:27.200 } 00:22:27.200 ] 00:22:27.200 }' 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.200 07:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:27.765 07:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:28.023 [2024-07-25 07:28:00.473807] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:28.023 [2024-07-25 07:28:00.473841] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x271f750 name Existed_Raid, state configuring 00:22:28.023 07:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:28.589 [2024-07-25 07:28:00.971155] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:28.589 [2024-07-25 07:28:00.972550] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:28.589 [2024-07-25 07:28:00.972582] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:28.589 [2024-07-25 07:28:00.972592] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:28.589 [2024-07-25 07:28:00.972603] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:28.589 [2024-07-25 07:28:00.972615] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:28.589 [2024-07-25 07:28:00.972625] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.589 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:28.847 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.847 "name": "Existed_Raid", 00:22:28.847 "uuid": "e8c30950-7386-442a-9342-b5c1ab99d459", 00:22:28.847 "strip_size_kb": 0, 00:22:28.847 "state": "configuring", 00:22:28.847 "raid_level": "raid1", 00:22:28.847 "superblock": true, 00:22:28.847 "num_base_bdevs": 4, 00:22:28.847 "num_base_bdevs_discovered": 1, 00:22:28.847 "num_base_bdevs_operational": 4, 00:22:28.847 "base_bdevs_list": [ 00:22:28.847 { 00:22:28.847 "name": "BaseBdev1", 00:22:28.847 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:28.847 "is_configured": true, 00:22:28.847 "data_offset": 2048, 00:22:28.847 "data_size": 63488 00:22:28.847 }, 00:22:28.848 { 00:22:28.848 "name": "BaseBdev2", 00:22:28.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.848 "is_configured": false, 00:22:28.848 "data_offset": 0, 00:22:28.848 "data_size": 0 00:22:28.848 }, 00:22:28.848 { 00:22:28.848 "name": "BaseBdev3", 00:22:28.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.848 "is_configured": false, 00:22:28.848 "data_offset": 0, 00:22:28.848 "data_size": 0 00:22:28.848 }, 00:22:28.848 { 00:22:28.848 "name": "BaseBdev4", 00:22:28.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.848 "is_configured": false, 00:22:28.848 "data_offset": 0, 00:22:28.848 "data_size": 0 00:22:28.848 } 00:22:28.848 ] 00:22:28.848 }' 00:22:28.848 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.848 07:28:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:29.413 07:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:29.671 [2024-07-25 07:28:02.024962] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:29.671 BaseBdev2 00:22:29.671 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:29.672 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:29.672 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:29.672 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:29.672 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:29.672 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:29.672 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:30.238 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:30.238 [ 00:22:30.238 { 00:22:30.238 "name": "BaseBdev2", 00:22:30.238 "aliases": [ 00:22:30.238 "4ab6d43c-f37f-4f36-baee-7fda49e51f75" 
00:22:30.238 ], 00:22:30.238 "product_name": "Malloc disk", 00:22:30.238 "block_size": 512, 00:22:30.238 "num_blocks": 65536, 00:22:30.238 "uuid": "4ab6d43c-f37f-4f36-baee-7fda49e51f75", 00:22:30.238 "assigned_rate_limits": { 00:22:30.238 "rw_ios_per_sec": 0, 00:22:30.238 "rw_mbytes_per_sec": 0, 00:22:30.238 "r_mbytes_per_sec": 0, 00:22:30.238 "w_mbytes_per_sec": 0 00:22:30.238 }, 00:22:30.238 "claimed": true, 00:22:30.238 "claim_type": "exclusive_write", 00:22:30.238 "zoned": false, 00:22:30.238 "supported_io_types": { 00:22:30.238 "read": true, 00:22:30.238 "write": true, 00:22:30.238 "unmap": true, 00:22:30.238 "flush": true, 00:22:30.238 "reset": true, 00:22:30.238 "nvme_admin": false, 00:22:30.238 "nvme_io": false, 00:22:30.238 "nvme_io_md": false, 00:22:30.238 "write_zeroes": true, 00:22:30.238 "zcopy": true, 00:22:30.238 "get_zone_info": false, 00:22:30.238 "zone_management": false, 00:22:30.238 "zone_append": false, 00:22:30.238 "compare": false, 00:22:30.238 "compare_and_write": false, 00:22:30.238 "abort": true, 00:22:30.238 "seek_hole": false, 00:22:30.238 "seek_data": false, 00:22:30.238 "copy": true, 00:22:30.238 "nvme_iov_md": false 00:22:30.238 }, 00:22:30.238 "memory_domains": [ 00:22:30.238 { 00:22:30.238 "dma_device_id": "system", 00:22:30.238 "dma_device_type": 1 00:22:30.238 }, 00:22:30.238 { 00:22:30.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:30.238 "dma_device_type": 2 00:22:30.238 } 00:22:30.238 ], 00:22:30.238 "driver_specific": {} 00:22:30.238 } 00:22:30.238 ] 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.508 07:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:30.508 07:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.508 "name": "Existed_Raid", 00:22:30.508 "uuid": 
"e8c30950-7386-442a-9342-b5c1ab99d459", 00:22:30.508 "strip_size_kb": 0, 00:22:30.508 "state": "configuring", 00:22:30.508 "raid_level": "raid1", 00:22:30.508 "superblock": true, 00:22:30.508 "num_base_bdevs": 4, 00:22:30.508 "num_base_bdevs_discovered": 2, 00:22:30.508 "num_base_bdevs_operational": 4, 00:22:30.508 "base_bdevs_list": [ 00:22:30.508 { 00:22:30.508 "name": "BaseBdev1", 00:22:30.508 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:30.508 "is_configured": true, 00:22:30.508 "data_offset": 2048, 00:22:30.508 "data_size": 63488 00:22:30.508 }, 00:22:30.508 { 00:22:30.508 "name": "BaseBdev2", 00:22:30.508 "uuid": "4ab6d43c-f37f-4f36-baee-7fda49e51f75", 00:22:30.508 "is_configured": true, 00:22:30.508 "data_offset": 2048, 00:22:30.508 "data_size": 63488 00:22:30.508 }, 00:22:30.508 { 00:22:30.508 "name": "BaseBdev3", 00:22:30.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.508 "is_configured": false, 00:22:30.508 "data_offset": 0, 00:22:30.508 "data_size": 0 00:22:30.508 }, 00:22:30.508 { 00:22:30.508 "name": "BaseBdev4", 00:22:30.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.508 "is_configured": false, 00:22:30.508 "data_offset": 0, 00:22:30.508 "data_size": 0 00:22:30.508 } 00:22:30.508 ] 00:22:30.508 }' 00:22:30.508 07:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.508 07:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:31.088 07:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:31.346 [2024-07-25 07:28:03.800912] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:31.346 BaseBdev3 00:22:31.346 07:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:31.346 07:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:22:31.346 07:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:31.346 07:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:31.346 07:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:31.346 07:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:31.346 07:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:31.604 07:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:31.862 [ 00:22:31.862 { 00:22:31.862 "name": "BaseBdev3", 00:22:31.862 "aliases": [ 00:22:31.862 "42f2d77d-2ee5-4ba3-b124-58b7e2db4441" 00:22:31.862 ], 00:22:31.862 "product_name": "Malloc disk", 00:22:31.862 "block_size": 512, 00:22:31.862 "num_blocks": 65536, 00:22:31.862 "uuid": "42f2d77d-2ee5-4ba3-b124-58b7e2db4441", 00:22:31.862 "assigned_rate_limits": { 00:22:31.862 "rw_ios_per_sec": 0, 00:22:31.862 "rw_mbytes_per_sec": 0, 00:22:31.862 "r_mbytes_per_sec": 0, 00:22:31.862 "w_mbytes_per_sec": 0 00:22:31.862 }, 00:22:31.862 "claimed": true, 00:22:31.862 "claim_type": 
"exclusive_write", 00:22:31.862 "zoned": false, 00:22:31.862 "supported_io_types": { 00:22:31.862 "read": true, 00:22:31.862 "write": true, 00:22:31.862 "unmap": true, 00:22:31.862 "flush": true, 00:22:31.862 "reset": true, 00:22:31.862 "nvme_admin": false, 00:22:31.862 "nvme_io": false, 00:22:31.862 "nvme_io_md": false, 00:22:31.862 "write_zeroes": true, 00:22:31.862 "zcopy": true, 00:22:31.862 "get_zone_info": false, 00:22:31.862 "zone_management": false, 00:22:31.862 "zone_append": false, 00:22:31.862 "compare": false, 00:22:31.862 "compare_and_write": false, 00:22:31.862 "abort": true, 00:22:31.862 "seek_hole": false, 00:22:31.862 "seek_data": false, 00:22:31.862 "copy": true, 00:22:31.862 "nvme_iov_md": false 00:22:31.862 }, 00:22:31.862 "memory_domains": [ 00:22:31.862 { 00:22:31.862 "dma_device_id": "system", 00:22:31.862 "dma_device_type": 1 00:22:31.862 }, 00:22:31.862 { 00:22:31.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.862 "dma_device_type": 2 00:22:31.862 } 00:22:31.862 ], 00:22:31.862 "driver_specific": {} 00:22:31.862 } 00:22:31.862 ] 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.862 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:32.120 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.120 "name": "Existed_Raid", 00:22:32.120 "uuid": "e8c30950-7386-442a-9342-b5c1ab99d459", 00:22:32.120 "strip_size_kb": 0, 00:22:32.120 "state": "configuring", 00:22:32.120 "raid_level": "raid1", 00:22:32.120 "superblock": true, 00:22:32.120 "num_base_bdevs": 4, 00:22:32.120 "num_base_bdevs_discovered": 3, 00:22:32.120 "num_base_bdevs_operational": 4, 00:22:32.120 "base_bdevs_list": [ 00:22:32.120 { 00:22:32.120 "name": "BaseBdev1", 00:22:32.120 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:32.120 
"is_configured": true, 00:22:32.120 "data_offset": 2048, 00:22:32.120 "data_size": 63488 00:22:32.120 }, 00:22:32.120 { 00:22:32.120 "name": "BaseBdev2", 00:22:32.120 "uuid": "4ab6d43c-f37f-4f36-baee-7fda49e51f75", 00:22:32.120 "is_configured": true, 00:22:32.120 "data_offset": 2048, 00:22:32.120 "data_size": 63488 00:22:32.120 }, 00:22:32.120 { 00:22:32.120 "name": "BaseBdev3", 00:22:32.120 "uuid": "42f2d77d-2ee5-4ba3-b124-58b7e2db4441", 00:22:32.120 "is_configured": true, 00:22:32.120 "data_offset": 2048, 00:22:32.120 "data_size": 63488 00:22:32.120 }, 00:22:32.120 { 00:22:32.120 "name": "BaseBdev4", 00:22:32.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.120 "is_configured": false, 00:22:32.120 "data_offset": 0, 00:22:32.120 "data_size": 0 00:22:32.120 } 00:22:32.120 ] 00:22:32.120 }' 00:22:32.120 07:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.120 07:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:32.685 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:32.943 [2024-07-25 07:28:05.300149] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:32.943 [2024-07-25 07:28:05.300299] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27207b0 00:22:32.943 [2024-07-25 07:28:05.300313] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:32.943 [2024-07-25 07:28:05.300470] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28d39d0 00:22:32.943 [2024-07-25 07:28:05.300590] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27207b0 00:22:32.943 [2024-07-25 07:28:05.300599] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x27207b0 00:22:32.943 [2024-07-25 07:28:05.300680] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.943 BaseBdev4 00:22:32.943 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:32.943 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:32.943 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:32.943 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:32.943 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:32.943 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:32.943 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:33.201 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:33.459 [ 00:22:33.459 { 00:22:33.459 "name": "BaseBdev4", 00:22:33.459 "aliases": [ 00:22:33.459 "5fb385fd-60f8-4ae9-b812-8968832c2018" 00:22:33.459 ], 00:22:33.459 "product_name": "Malloc disk", 00:22:33.459 "block_size": 512, 00:22:33.459 "num_blocks": 65536, 00:22:33.459 
"uuid": "5fb385fd-60f8-4ae9-b812-8968832c2018", 00:22:33.459 "assigned_rate_limits": { 00:22:33.459 "rw_ios_per_sec": 0, 00:22:33.459 "rw_mbytes_per_sec": 0, 00:22:33.459 "r_mbytes_per_sec": 0, 00:22:33.459 "w_mbytes_per_sec": 0 00:22:33.459 }, 00:22:33.459 "claimed": true, 00:22:33.459 "claim_type": "exclusive_write", 00:22:33.459 "zoned": false, 00:22:33.459 "supported_io_types": { 00:22:33.459 "read": true, 00:22:33.459 "write": true, 00:22:33.459 "unmap": true, 00:22:33.459 "flush": true, 00:22:33.459 "reset": true, 00:22:33.459 "nvme_admin": false, 00:22:33.459 "nvme_io": false, 00:22:33.459 "nvme_io_md": false, 00:22:33.459 "write_zeroes": true, 00:22:33.459 "zcopy": true, 00:22:33.459 "get_zone_info": false, 00:22:33.459 "zone_management": false, 00:22:33.459 "zone_append": false, 00:22:33.459 "compare": false, 00:22:33.459 "compare_and_write": false, 00:22:33.459 "abort": true, 00:22:33.459 "seek_hole": false, 00:22:33.459 "seek_data": false, 00:22:33.459 "copy": true, 00:22:33.459 "nvme_iov_md": false 00:22:33.459 }, 00:22:33.459 "memory_domains": [ 00:22:33.459 { 00:22:33.459 "dma_device_id": "system", 00:22:33.459 "dma_device_type": 1 00:22:33.459 }, 00:22:33.459 { 00:22:33.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.459 "dma_device_type": 2 00:22:33.459 } 00:22:33.459 ], 00:22:33.459 "driver_specific": {} 00:22:33.459 } 00:22:33.459 ] 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.459 07:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:33.717 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.717 "name": "Existed_Raid", 00:22:33.717 "uuid": "e8c30950-7386-442a-9342-b5c1ab99d459", 00:22:33.717 "strip_size_kb": 0, 00:22:33.717 "state": "online", 00:22:33.717 "raid_level": "raid1", 00:22:33.717 "superblock": 
true, 00:22:33.717 "num_base_bdevs": 4, 00:22:33.717 "num_base_bdevs_discovered": 4, 00:22:33.717 "num_base_bdevs_operational": 4, 00:22:33.717 "base_bdevs_list": [ 00:22:33.717 { 00:22:33.717 "name": "BaseBdev1", 00:22:33.717 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:33.717 "is_configured": true, 00:22:33.717 "data_offset": 2048, 00:22:33.717 "data_size": 63488 00:22:33.717 }, 00:22:33.717 { 00:22:33.717 "name": "BaseBdev2", 00:22:33.717 "uuid": "4ab6d43c-f37f-4f36-baee-7fda49e51f75", 00:22:33.717 "is_configured": true, 00:22:33.717 "data_offset": 2048, 00:22:33.717 "data_size": 63488 00:22:33.717 }, 00:22:33.717 { 00:22:33.717 "name": "BaseBdev3", 00:22:33.717 "uuid": "42f2d77d-2ee5-4ba3-b124-58b7e2db4441", 00:22:33.717 "is_configured": true, 00:22:33.717 "data_offset": 2048, 00:22:33.717 "data_size": 63488 00:22:33.717 }, 00:22:33.717 { 00:22:33.717 "name": "BaseBdev4", 00:22:33.717 "uuid": "5fb385fd-60f8-4ae9-b812-8968832c2018", 00:22:33.717 "is_configured": true, 00:22:33.717 "data_offset": 2048, 00:22:33.717 "data_size": 63488 00:22:33.717 } 00:22:33.717 ] 00:22:33.717 }' 00:22:33.717 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.717 07:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:34.283 [2024-07-25 07:28:06.780357] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:34.283 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:34.283 "name": "Existed_Raid", 00:22:34.283 "aliases": [ 00:22:34.283 "e8c30950-7386-442a-9342-b5c1ab99d459" 00:22:34.283 ], 00:22:34.283 "product_name": "Raid Volume", 00:22:34.283 "block_size": 512, 00:22:34.283 "num_blocks": 63488, 00:22:34.283 "uuid": "e8c30950-7386-442a-9342-b5c1ab99d459", 00:22:34.283 "assigned_rate_limits": { 00:22:34.283 "rw_ios_per_sec": 0, 00:22:34.283 "rw_mbytes_per_sec": 0, 00:22:34.283 "r_mbytes_per_sec": 0, 00:22:34.283 "w_mbytes_per_sec": 0 00:22:34.283 }, 00:22:34.283 "claimed": false, 00:22:34.283 "zoned": false, 00:22:34.283 "supported_io_types": { 00:22:34.283 "read": true, 00:22:34.283 "write": true, 00:22:34.283 "unmap": false, 00:22:34.283 "flush": false, 00:22:34.283 "reset": true, 00:22:34.283 "nvme_admin": false, 00:22:34.283 "nvme_io": false, 00:22:34.283 "nvme_io_md": false, 00:22:34.283 "write_zeroes": true, 00:22:34.283 "zcopy": false, 00:22:34.283 "get_zone_info": false, 00:22:34.283 "zone_management": false, 00:22:34.283 "zone_append": 
false, 00:22:34.283 "compare": false, 00:22:34.283 "compare_and_write": false, 00:22:34.283 "abort": false, 00:22:34.283 "seek_hole": false, 00:22:34.283 "seek_data": false, 00:22:34.283 "copy": false, 00:22:34.283 "nvme_iov_md": false 00:22:34.283 }, 00:22:34.283 "memory_domains": [ 00:22:34.283 { 00:22:34.283 "dma_device_id": "system", 00:22:34.283 "dma_device_type": 1 00:22:34.283 }, 00:22:34.283 { 00:22:34.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.283 "dma_device_type": 2 00:22:34.283 }, 00:22:34.283 { 00:22:34.283 "dma_device_id": "system", 00:22:34.283 "dma_device_type": 1 00:22:34.283 }, 00:22:34.283 { 00:22:34.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.283 "dma_device_type": 2 00:22:34.283 }, 00:22:34.283 { 00:22:34.283 "dma_device_id": "system", 00:22:34.283 "dma_device_type": 1 00:22:34.283 }, 00:22:34.283 { 00:22:34.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.283 "dma_device_type": 2 00:22:34.284 }, 00:22:34.284 { 00:22:34.284 "dma_device_id": "system", 00:22:34.284 "dma_device_type": 1 00:22:34.284 }, 00:22:34.284 { 00:22:34.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.284 "dma_device_type": 2 00:22:34.284 } 00:22:34.284 ], 00:22:34.284 "driver_specific": { 00:22:34.284 "raid": { 00:22:34.284 "uuid": "e8c30950-7386-442a-9342-b5c1ab99d459", 00:22:34.284 "strip_size_kb": 0, 00:22:34.284 "state": "online", 00:22:34.284 "raid_level": "raid1", 00:22:34.284 "superblock": true, 00:22:34.284 "num_base_bdevs": 4, 00:22:34.284 "num_base_bdevs_discovered": 4, 00:22:34.284 "num_base_bdevs_operational": 4, 00:22:34.284 "base_bdevs_list": [ 00:22:34.284 { 00:22:34.284 "name": "BaseBdev1", 00:22:34.284 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:34.284 "is_configured": true, 00:22:34.284 "data_offset": 2048, 00:22:34.284 "data_size": 63488 00:22:34.284 }, 00:22:34.284 { 00:22:34.284 "name": "BaseBdev2", 00:22:34.284 "uuid": "4ab6d43c-f37f-4f36-baee-7fda49e51f75", 00:22:34.284 "is_configured": true, 00:22:34.284 "data_offset": 2048, 00:22:34.284 "data_size": 63488 00:22:34.284 }, 00:22:34.284 { 00:22:34.284 "name": "BaseBdev3", 00:22:34.284 "uuid": "42f2d77d-2ee5-4ba3-b124-58b7e2db4441", 00:22:34.284 "is_configured": true, 00:22:34.284 "data_offset": 2048, 00:22:34.284 "data_size": 63488 00:22:34.284 }, 00:22:34.284 { 00:22:34.284 "name": "BaseBdev4", 00:22:34.284 "uuid": "5fb385fd-60f8-4ae9-b812-8968832c2018", 00:22:34.284 "is_configured": true, 00:22:34.284 "data_offset": 2048, 00:22:34.284 "data_size": 63488 00:22:34.284 } 00:22:34.284 ] 00:22:34.284 } 00:22:34.284 } 00:22:34.284 }' 00:22:34.284 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:34.542 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:34.542 BaseBdev2 00:22:34.542 BaseBdev3 00:22:34.542 BaseBdev4' 00:22:34.542 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:34.542 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:34.542 07:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:34.542 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:34.542 "name": "BaseBdev1", 00:22:34.542 "aliases": [ 00:22:34.542 
"7359f8fd-1447-4eda-afde-a73d1182ce03" 00:22:34.542 ], 00:22:34.542 "product_name": "Malloc disk", 00:22:34.542 "block_size": 512, 00:22:34.542 "num_blocks": 65536, 00:22:34.542 "uuid": "7359f8fd-1447-4eda-afde-a73d1182ce03", 00:22:34.542 "assigned_rate_limits": { 00:22:34.542 "rw_ios_per_sec": 0, 00:22:34.542 "rw_mbytes_per_sec": 0, 00:22:34.542 "r_mbytes_per_sec": 0, 00:22:34.542 "w_mbytes_per_sec": 0 00:22:34.542 }, 00:22:34.542 "claimed": true, 00:22:34.542 "claim_type": "exclusive_write", 00:22:34.542 "zoned": false, 00:22:34.542 "supported_io_types": { 00:22:34.542 "read": true, 00:22:34.542 "write": true, 00:22:34.542 "unmap": true, 00:22:34.542 "flush": true, 00:22:34.542 "reset": true, 00:22:34.542 "nvme_admin": false, 00:22:34.542 "nvme_io": false, 00:22:34.542 "nvme_io_md": false, 00:22:34.542 "write_zeroes": true, 00:22:34.542 "zcopy": true, 00:22:34.542 "get_zone_info": false, 00:22:34.542 "zone_management": false, 00:22:34.542 "zone_append": false, 00:22:34.542 "compare": false, 00:22:34.542 "compare_and_write": false, 00:22:34.542 "abort": true, 00:22:34.542 "seek_hole": false, 00:22:34.542 "seek_data": false, 00:22:34.542 "copy": true, 00:22:34.542 "nvme_iov_md": false 00:22:34.542 }, 00:22:34.542 "memory_domains": [ 00:22:34.542 { 00:22:34.542 "dma_device_id": "system", 00:22:34.542 "dma_device_type": 1 00:22:34.542 }, 00:22:34.542 { 00:22:34.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.542 "dma_device_type": 2 00:22:34.542 } 00:22:34.542 ], 00:22:34.542 "driver_specific": {} 00:22:34.542 }' 00:22:34.542 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:34.800 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:34.800 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:34.800 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:34.800 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:34.800 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:34.800 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:34.800 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.058 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:35.058 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.058 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.058 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:35.058 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:35.058 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:35.058 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:35.317 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:35.317 "name": "BaseBdev2", 00:22:35.317 "aliases": [ 00:22:35.317 "4ab6d43c-f37f-4f36-baee-7fda49e51f75" 00:22:35.317 ], 00:22:35.317 "product_name": "Malloc disk", 00:22:35.317 "block_size": 512, 
00:22:35.317 "num_blocks": 65536, 00:22:35.317 "uuid": "4ab6d43c-f37f-4f36-baee-7fda49e51f75", 00:22:35.317 "assigned_rate_limits": { 00:22:35.317 "rw_ios_per_sec": 0, 00:22:35.317 "rw_mbytes_per_sec": 0, 00:22:35.317 "r_mbytes_per_sec": 0, 00:22:35.317 "w_mbytes_per_sec": 0 00:22:35.317 }, 00:22:35.317 "claimed": true, 00:22:35.317 "claim_type": "exclusive_write", 00:22:35.317 "zoned": false, 00:22:35.317 "supported_io_types": { 00:22:35.317 "read": true, 00:22:35.317 "write": true, 00:22:35.317 "unmap": true, 00:22:35.317 "flush": true, 00:22:35.317 "reset": true, 00:22:35.317 "nvme_admin": false, 00:22:35.317 "nvme_io": false, 00:22:35.317 "nvme_io_md": false, 00:22:35.317 "write_zeroes": true, 00:22:35.317 "zcopy": true, 00:22:35.317 "get_zone_info": false, 00:22:35.317 "zone_management": false, 00:22:35.317 "zone_append": false, 00:22:35.317 "compare": false, 00:22:35.317 "compare_and_write": false, 00:22:35.317 "abort": true, 00:22:35.317 "seek_hole": false, 00:22:35.317 "seek_data": false, 00:22:35.317 "copy": true, 00:22:35.317 "nvme_iov_md": false 00:22:35.317 }, 00:22:35.317 "memory_domains": [ 00:22:35.317 { 00:22:35.317 "dma_device_id": "system", 00:22:35.317 "dma_device_type": 1 00:22:35.317 }, 00:22:35.317 { 00:22:35.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.317 "dma_device_type": 2 00:22:35.317 } 00:22:35.317 ], 00:22:35.317 "driver_specific": {} 00:22:35.317 }' 00:22:35.317 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.317 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.317 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:35.317 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.318 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.318 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:35.318 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:35.576 07:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:35.834 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:35.834 "name": "BaseBdev3", 00:22:35.834 "aliases": [ 00:22:35.834 "42f2d77d-2ee5-4ba3-b124-58b7e2db4441" 00:22:35.834 ], 00:22:35.834 "product_name": "Malloc disk", 00:22:35.834 "block_size": 512, 00:22:35.834 "num_blocks": 65536, 00:22:35.834 "uuid": "42f2d77d-2ee5-4ba3-b124-58b7e2db4441", 00:22:35.834 "assigned_rate_limits": { 
00:22:35.834 "rw_ios_per_sec": 0, 00:22:35.834 "rw_mbytes_per_sec": 0, 00:22:35.834 "r_mbytes_per_sec": 0, 00:22:35.834 "w_mbytes_per_sec": 0 00:22:35.834 }, 00:22:35.834 "claimed": true, 00:22:35.834 "claim_type": "exclusive_write", 00:22:35.834 "zoned": false, 00:22:35.834 "supported_io_types": { 00:22:35.834 "read": true, 00:22:35.834 "write": true, 00:22:35.834 "unmap": true, 00:22:35.834 "flush": true, 00:22:35.834 "reset": true, 00:22:35.834 "nvme_admin": false, 00:22:35.834 "nvme_io": false, 00:22:35.834 "nvme_io_md": false, 00:22:35.834 "write_zeroes": true, 00:22:35.834 "zcopy": true, 00:22:35.834 "get_zone_info": false, 00:22:35.834 "zone_management": false, 00:22:35.834 "zone_append": false, 00:22:35.834 "compare": false, 00:22:35.834 "compare_and_write": false, 00:22:35.834 "abort": true, 00:22:35.834 "seek_hole": false, 00:22:35.834 "seek_data": false, 00:22:35.834 "copy": true, 00:22:35.834 "nvme_iov_md": false 00:22:35.834 }, 00:22:35.834 "memory_domains": [ 00:22:35.834 { 00:22:35.834 "dma_device_id": "system", 00:22:35.834 "dma_device_type": 1 00:22:35.834 }, 00:22:35.834 { 00:22:35.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.834 "dma_device_type": 2 00:22:35.834 } 00:22:35.834 ], 00:22:35.834 "driver_specific": {} 00:22:35.834 }' 00:22:35.834 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.834 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.834 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:35.834 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.834 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:36.092 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:36.351 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:36.351 "name": "BaseBdev4", 00:22:36.351 "aliases": [ 00:22:36.351 "5fb385fd-60f8-4ae9-b812-8968832c2018" 00:22:36.351 ], 00:22:36.351 "product_name": "Malloc disk", 00:22:36.351 "block_size": 512, 00:22:36.351 "num_blocks": 65536, 00:22:36.351 "uuid": "5fb385fd-60f8-4ae9-b812-8968832c2018", 00:22:36.351 "assigned_rate_limits": { 00:22:36.351 "rw_ios_per_sec": 0, 00:22:36.351 "rw_mbytes_per_sec": 0, 00:22:36.351 "r_mbytes_per_sec": 0, 00:22:36.351 
"w_mbytes_per_sec": 0 00:22:36.351 }, 00:22:36.351 "claimed": true, 00:22:36.351 "claim_type": "exclusive_write", 00:22:36.351 "zoned": false, 00:22:36.351 "supported_io_types": { 00:22:36.351 "read": true, 00:22:36.351 "write": true, 00:22:36.351 "unmap": true, 00:22:36.351 "flush": true, 00:22:36.351 "reset": true, 00:22:36.351 "nvme_admin": false, 00:22:36.351 "nvme_io": false, 00:22:36.351 "nvme_io_md": false, 00:22:36.351 "write_zeroes": true, 00:22:36.351 "zcopy": true, 00:22:36.351 "get_zone_info": false, 00:22:36.351 "zone_management": false, 00:22:36.351 "zone_append": false, 00:22:36.351 "compare": false, 00:22:36.351 "compare_and_write": false, 00:22:36.351 "abort": true, 00:22:36.351 "seek_hole": false, 00:22:36.351 "seek_data": false, 00:22:36.351 "copy": true, 00:22:36.351 "nvme_iov_md": false 00:22:36.351 }, 00:22:36.351 "memory_domains": [ 00:22:36.351 { 00:22:36.351 "dma_device_id": "system", 00:22:36.351 "dma_device_type": 1 00:22:36.351 }, 00:22:36.351 { 00:22:36.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.351 "dma_device_type": 2 00:22:36.351 } 00:22:36.351 ], 00:22:36.351 "driver_specific": {} 00:22:36.351 }' 00:22:36.351 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.351 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.351 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:36.351 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.609 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.609 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:36.609 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.609 07:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.609 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:36.609 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.610 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.610 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:36.610 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:36.868 [2024-07-25 07:28:09.306774] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.868 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.127 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.127 "name": "Existed_Raid", 00:22:37.127 "uuid": "e8c30950-7386-442a-9342-b5c1ab99d459", 00:22:37.127 "strip_size_kb": 0, 00:22:37.127 "state": "online", 00:22:37.127 "raid_level": "raid1", 00:22:37.127 "superblock": true, 00:22:37.127 "num_base_bdevs": 4, 00:22:37.127 "num_base_bdevs_discovered": 3, 00:22:37.127 "num_base_bdevs_operational": 3, 00:22:37.127 "base_bdevs_list": [ 00:22:37.127 { 00:22:37.127 "name": null, 00:22:37.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.127 "is_configured": false, 00:22:37.127 "data_offset": 2048, 00:22:37.127 "data_size": 63488 00:22:37.127 }, 00:22:37.127 { 00:22:37.127 "name": "BaseBdev2", 00:22:37.127 "uuid": "4ab6d43c-f37f-4f36-baee-7fda49e51f75", 00:22:37.127 "is_configured": true, 00:22:37.127 "data_offset": 2048, 00:22:37.127 "data_size": 63488 00:22:37.127 }, 00:22:37.127 { 00:22:37.127 "name": "BaseBdev3", 00:22:37.127 "uuid": "42f2d77d-2ee5-4ba3-b124-58b7e2db4441", 00:22:37.127 "is_configured": true, 00:22:37.127 "data_offset": 2048, 00:22:37.127 "data_size": 63488 00:22:37.127 }, 00:22:37.127 { 00:22:37.127 "name": "BaseBdev4", 00:22:37.127 "uuid": "5fb385fd-60f8-4ae9-b812-8968832c2018", 00:22:37.127 "is_configured": true, 00:22:37.127 "data_offset": 2048, 00:22:37.127 "data_size": 63488 00:22:37.127 } 00:22:37.127 ] 00:22:37.127 }' 00:22:37.127 07:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.127 07:28:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:37.693 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:37.693 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:37.693 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.693 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:37.952 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:37.952 
07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:37.952 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:38.211 [2024-07-25 07:28:10.551029] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:38.211 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:38.211 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:38.211 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.211 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:38.469 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:38.469 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:38.469 07:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:38.727 [2024-07-25 07:28:11.022492] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:38.727 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:38.727 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:38.727 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.727 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:38.985 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:38.985 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:38.985 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:38.985 [2024-07-25 07:28:11.489687] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:38.985 [2024-07-25 07:28:11.489758] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:38.985 [2024-07-25 07:28:11.499868] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:38.985 [2024-07-25 07:28:11.499896] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:38.985 [2024-07-25 07:28:11.499907] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27207b0 name Existed_Raid, state offline 00:22:38.985 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:38.985 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:39.244 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:39.502 BaseBdev2 00:22:39.502 07:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:39.502 07:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:39.502 07:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:39.502 07:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:39.502 07:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:39.502 07:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:39.502 07:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:39.760 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:40.018 [ 00:22:40.018 { 00:22:40.018 "name": "BaseBdev2", 00:22:40.018 "aliases": [ 00:22:40.018 "5799e743-ba6e-45ff-b83f-707e09d7cf11" 00:22:40.018 ], 00:22:40.018 "product_name": "Malloc disk", 00:22:40.018 "block_size": 512, 00:22:40.018 "num_blocks": 65536, 00:22:40.018 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:40.018 "assigned_rate_limits": { 00:22:40.018 "rw_ios_per_sec": 0, 00:22:40.018 "rw_mbytes_per_sec": 0, 00:22:40.018 "r_mbytes_per_sec": 0, 00:22:40.018 "w_mbytes_per_sec": 0 00:22:40.018 }, 00:22:40.018 "claimed": false, 00:22:40.018 "zoned": false, 00:22:40.018 "supported_io_types": { 00:22:40.018 "read": true, 00:22:40.018 "write": true, 00:22:40.018 "unmap": true, 00:22:40.018 "flush": true, 00:22:40.018 "reset": true, 00:22:40.018 "nvme_admin": false, 00:22:40.018 "nvme_io": false, 00:22:40.018 "nvme_io_md": false, 00:22:40.018 "write_zeroes": true, 00:22:40.018 "zcopy": true, 00:22:40.018 "get_zone_info": false, 00:22:40.018 "zone_management": false, 00:22:40.018 "zone_append": false, 00:22:40.018 "compare": false, 00:22:40.018 "compare_and_write": false, 00:22:40.018 "abort": true, 00:22:40.018 "seek_hole": false, 00:22:40.018 "seek_data": false, 00:22:40.018 "copy": true, 00:22:40.018 "nvme_iov_md": false 00:22:40.018 }, 00:22:40.018 "memory_domains": [ 00:22:40.018 { 00:22:40.018 "dma_device_id": "system", 00:22:40.018 "dma_device_type": 1 00:22:40.018 }, 00:22:40.018 { 00:22:40.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:22:40.018 "dma_device_type": 2 00:22:40.018 } 00:22:40.018 ], 00:22:40.018 "driver_specific": {} 00:22:40.018 } 00:22:40.018 ] 00:22:40.018 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:40.018 07:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:40.018 07:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:40.018 07:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:40.276 BaseBdev3 00:22:40.276 07:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:40.276 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:22:40.276 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:40.276 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:40.276 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:40.277 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:40.277 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:40.535 07:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:40.794 [ 00:22:40.794 { 00:22:40.794 "name": "BaseBdev3", 00:22:40.794 "aliases": [ 00:22:40.794 "5971c5c1-aeb7-44fc-a0b6-1ea506682c34" 00:22:40.794 ], 00:22:40.794 "product_name": "Malloc disk", 00:22:40.794 "block_size": 512, 00:22:40.794 "num_blocks": 65536, 00:22:40.794 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:40.794 "assigned_rate_limits": { 00:22:40.794 "rw_ios_per_sec": 0, 00:22:40.794 "rw_mbytes_per_sec": 0, 00:22:40.794 "r_mbytes_per_sec": 0, 00:22:40.794 "w_mbytes_per_sec": 0 00:22:40.794 }, 00:22:40.794 "claimed": false, 00:22:40.794 "zoned": false, 00:22:40.794 "supported_io_types": { 00:22:40.794 "read": true, 00:22:40.794 "write": true, 00:22:40.794 "unmap": true, 00:22:40.794 "flush": true, 00:22:40.794 "reset": true, 00:22:40.794 "nvme_admin": false, 00:22:40.794 "nvme_io": false, 00:22:40.794 "nvme_io_md": false, 00:22:40.794 "write_zeroes": true, 00:22:40.794 "zcopy": true, 00:22:40.794 "get_zone_info": false, 00:22:40.794 "zone_management": false, 00:22:40.794 "zone_append": false, 00:22:40.794 "compare": false, 00:22:40.794 "compare_and_write": false, 00:22:40.794 "abort": true, 00:22:40.794 "seek_hole": false, 00:22:40.794 "seek_data": false, 00:22:40.794 "copy": true, 00:22:40.794 "nvme_iov_md": false 00:22:40.794 }, 00:22:40.794 "memory_domains": [ 00:22:40.794 { 00:22:40.794 "dma_device_id": "system", 00:22:40.794 "dma_device_type": 1 00:22:40.794 }, 00:22:40.794 { 00:22:40.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.794 "dma_device_type": 2 00:22:40.794 } 00:22:40.794 ], 00:22:40.794 "driver_specific": {} 00:22:40.794 } 00:22:40.794 ] 00:22:40.794 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:40.794 
07:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:40.794 07:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:40.794 07:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:40.794 BaseBdev4 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:41.052 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:41.310 [ 00:22:41.310 { 00:22:41.310 "name": "BaseBdev4", 00:22:41.310 "aliases": [ 00:22:41.310 "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d" 00:22:41.310 ], 00:22:41.310 "product_name": "Malloc disk", 00:22:41.310 "block_size": 512, 00:22:41.310 "num_blocks": 65536, 00:22:41.310 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:41.310 "assigned_rate_limits": { 00:22:41.310 "rw_ios_per_sec": 0, 00:22:41.310 "rw_mbytes_per_sec": 0, 00:22:41.310 "r_mbytes_per_sec": 0, 00:22:41.310 "w_mbytes_per_sec": 0 00:22:41.310 }, 00:22:41.310 "claimed": false, 00:22:41.310 "zoned": false, 00:22:41.310 "supported_io_types": { 00:22:41.310 "read": true, 00:22:41.310 "write": true, 00:22:41.310 "unmap": true, 00:22:41.310 "flush": true, 00:22:41.310 "reset": true, 00:22:41.310 "nvme_admin": false, 00:22:41.310 "nvme_io": false, 00:22:41.310 "nvme_io_md": false, 00:22:41.310 "write_zeroes": true, 00:22:41.310 "zcopy": true, 00:22:41.310 "get_zone_info": false, 00:22:41.310 "zone_management": false, 00:22:41.310 "zone_append": false, 00:22:41.310 "compare": false, 00:22:41.310 "compare_and_write": false, 00:22:41.310 "abort": true, 00:22:41.310 "seek_hole": false, 00:22:41.310 "seek_data": false, 00:22:41.310 "copy": true, 00:22:41.310 "nvme_iov_md": false 00:22:41.310 }, 00:22:41.310 "memory_domains": [ 00:22:41.310 { 00:22:41.310 "dma_device_id": "system", 00:22:41.310 "dma_device_type": 1 00:22:41.310 }, 00:22:41.310 { 00:22:41.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.310 "dma_device_type": 2 00:22:41.310 } 00:22:41.310 ], 00:22:41.310 "driver_specific": {} 00:22:41.310 } 00:22:41.310 ] 00:22:41.310 07:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:41.310 07:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:41.310 07:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:41.310 07:28:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:41.569 [2024-07-25 07:28:13.994875] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:41.569 [2024-07-25 07:28:13.994911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:41.569 [2024-07-25 07:28:13.994929] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:41.569 [2024-07-25 07:28:13.996164] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:41.569 [2024-07-25 07:28:13.996203] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.569 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:41.827 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.827 "name": "Existed_Raid", 00:22:41.827 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:41.827 "strip_size_kb": 0, 00:22:41.827 "state": "configuring", 00:22:41.827 "raid_level": "raid1", 00:22:41.827 "superblock": true, 00:22:41.827 "num_base_bdevs": 4, 00:22:41.827 "num_base_bdevs_discovered": 3, 00:22:41.827 "num_base_bdevs_operational": 4, 00:22:41.827 "base_bdevs_list": [ 00:22:41.827 { 00:22:41.827 "name": "BaseBdev1", 00:22:41.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.827 "is_configured": false, 00:22:41.827 "data_offset": 0, 00:22:41.827 "data_size": 0 00:22:41.827 }, 00:22:41.827 { 00:22:41.827 "name": "BaseBdev2", 00:22:41.827 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:41.827 "is_configured": true, 00:22:41.827 "data_offset": 2048, 00:22:41.827 "data_size": 63488 00:22:41.827 }, 00:22:41.827 { 00:22:41.827 "name": "BaseBdev3", 00:22:41.827 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:41.827 "is_configured": true, 00:22:41.827 "data_offset": 2048, 
00:22:41.827 "data_size": 63488 00:22:41.827 }, 00:22:41.827 { 00:22:41.827 "name": "BaseBdev4", 00:22:41.827 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:41.827 "is_configured": true, 00:22:41.827 "data_offset": 2048, 00:22:41.827 "data_size": 63488 00:22:41.827 } 00:22:41.827 ] 00:22:41.827 }' 00:22:41.827 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.827 07:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:42.394 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:42.652 [2024-07-25 07:28:14.961384] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.652 07:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:42.911 07:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.911 "name": "Existed_Raid", 00:22:42.911 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:42.911 "strip_size_kb": 0, 00:22:42.911 "state": "configuring", 00:22:42.911 "raid_level": "raid1", 00:22:42.911 "superblock": true, 00:22:42.911 "num_base_bdevs": 4, 00:22:42.911 "num_base_bdevs_discovered": 2, 00:22:42.911 "num_base_bdevs_operational": 4, 00:22:42.911 "base_bdevs_list": [ 00:22:42.911 { 00:22:42.911 "name": "BaseBdev1", 00:22:42.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.911 "is_configured": false, 00:22:42.911 "data_offset": 0, 00:22:42.911 "data_size": 0 00:22:42.911 }, 00:22:42.911 { 00:22:42.911 "name": null, 00:22:42.911 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:42.911 "is_configured": false, 00:22:42.911 "data_offset": 2048, 00:22:42.911 "data_size": 63488 00:22:42.911 }, 00:22:42.911 { 00:22:42.911 "name": "BaseBdev3", 00:22:42.911 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:42.911 "is_configured": true, 00:22:42.911 "data_offset": 2048, 00:22:42.911 "data_size": 63488 00:22:42.911 }, 00:22:42.911 
{ 00:22:42.911 "name": "BaseBdev4", 00:22:42.911 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:42.911 "is_configured": true, 00:22:42.911 "data_offset": 2048, 00:22:42.911 "data_size": 63488 00:22:42.911 } 00:22:42.911 ] 00:22:42.911 }' 00:22:42.911 07:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.911 07:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:43.510 07:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:43.510 07:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.510 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:43.510 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:43.768 [2024-07-25 07:28:16.227947] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:43.768 BaseBdev1 00:22:43.768 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:43.768 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:43.768 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:43.768 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:43.768 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:43.768 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:43.768 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:44.026 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:44.284 [ 00:22:44.284 { 00:22:44.284 "name": "BaseBdev1", 00:22:44.284 "aliases": [ 00:22:44.284 "22852a07-641f-4464-b1ef-f8455c5ef003" 00:22:44.284 ], 00:22:44.284 "product_name": "Malloc disk", 00:22:44.284 "block_size": 512, 00:22:44.284 "num_blocks": 65536, 00:22:44.284 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:44.284 "assigned_rate_limits": { 00:22:44.284 "rw_ios_per_sec": 0, 00:22:44.284 "rw_mbytes_per_sec": 0, 00:22:44.284 "r_mbytes_per_sec": 0, 00:22:44.284 "w_mbytes_per_sec": 0 00:22:44.284 }, 00:22:44.284 "claimed": true, 00:22:44.284 "claim_type": "exclusive_write", 00:22:44.284 "zoned": false, 00:22:44.284 "supported_io_types": { 00:22:44.284 "read": true, 00:22:44.284 "write": true, 00:22:44.284 "unmap": true, 00:22:44.284 "flush": true, 00:22:44.284 "reset": true, 00:22:44.284 "nvme_admin": false, 00:22:44.284 "nvme_io": false, 00:22:44.284 "nvme_io_md": false, 00:22:44.284 "write_zeroes": true, 00:22:44.284 "zcopy": true, 00:22:44.284 "get_zone_info": false, 00:22:44.284 "zone_management": false, 00:22:44.284 "zone_append": false, 00:22:44.284 "compare": false, 00:22:44.284 "compare_and_write": false, 
00:22:44.284 "abort": true, 00:22:44.284 "seek_hole": false, 00:22:44.284 "seek_data": false, 00:22:44.284 "copy": true, 00:22:44.284 "nvme_iov_md": false 00:22:44.284 }, 00:22:44.284 "memory_domains": [ 00:22:44.284 { 00:22:44.284 "dma_device_id": "system", 00:22:44.284 "dma_device_type": 1 00:22:44.284 }, 00:22:44.284 { 00:22:44.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.284 "dma_device_type": 2 00:22:44.284 } 00:22:44.284 ], 00:22:44.284 "driver_specific": {} 00:22:44.284 } 00:22:44.284 ] 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.284 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:44.542 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.542 "name": "Existed_Raid", 00:22:44.542 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:44.542 "strip_size_kb": 0, 00:22:44.542 "state": "configuring", 00:22:44.542 "raid_level": "raid1", 00:22:44.542 "superblock": true, 00:22:44.542 "num_base_bdevs": 4, 00:22:44.542 "num_base_bdevs_discovered": 3, 00:22:44.542 "num_base_bdevs_operational": 4, 00:22:44.542 "base_bdevs_list": [ 00:22:44.542 { 00:22:44.542 "name": "BaseBdev1", 00:22:44.542 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:44.542 "is_configured": true, 00:22:44.542 "data_offset": 2048, 00:22:44.542 "data_size": 63488 00:22:44.542 }, 00:22:44.542 { 00:22:44.542 "name": null, 00:22:44.542 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:44.542 "is_configured": false, 00:22:44.542 "data_offset": 2048, 00:22:44.542 "data_size": 63488 00:22:44.542 }, 00:22:44.542 { 00:22:44.542 "name": "BaseBdev3", 00:22:44.542 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:44.542 "is_configured": true, 00:22:44.542 "data_offset": 2048, 00:22:44.542 "data_size": 63488 00:22:44.542 }, 00:22:44.542 { 00:22:44.542 "name": "BaseBdev4", 00:22:44.542 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:44.542 "is_configured": true, 00:22:44.542 "data_offset": 2048, 00:22:44.542 "data_size": 63488 00:22:44.542 } 
00:22:44.542 ] 00:22:44.542 }' 00:22:44.542 07:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.542 07:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:45.107 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.107 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:45.365 [2024-07-25 07:28:17.868304] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.365 07:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:45.623 07:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:45.623 "name": "Existed_Raid", 00:22:45.623 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:45.623 "strip_size_kb": 0, 00:22:45.623 "state": "configuring", 00:22:45.623 "raid_level": "raid1", 00:22:45.623 "superblock": true, 00:22:45.623 "num_base_bdevs": 4, 00:22:45.623 "num_base_bdevs_discovered": 2, 00:22:45.623 "num_base_bdevs_operational": 4, 00:22:45.623 "base_bdevs_list": [ 00:22:45.623 { 00:22:45.623 "name": "BaseBdev1", 00:22:45.623 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:45.623 "is_configured": true, 00:22:45.623 "data_offset": 2048, 00:22:45.623 "data_size": 63488 00:22:45.623 }, 00:22:45.623 { 00:22:45.623 "name": null, 00:22:45.623 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:45.623 "is_configured": false, 00:22:45.623 "data_offset": 2048, 00:22:45.623 "data_size": 63488 00:22:45.623 }, 00:22:45.623 { 00:22:45.623 "name": null, 00:22:45.623 
"uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:45.623 "is_configured": false, 00:22:45.623 "data_offset": 2048, 00:22:45.623 "data_size": 63488 00:22:45.623 }, 00:22:45.623 { 00:22:45.623 "name": "BaseBdev4", 00:22:45.623 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:45.623 "is_configured": true, 00:22:45.623 "data_offset": 2048, 00:22:45.623 "data_size": 63488 00:22:45.623 } 00:22:45.623 ] 00:22:45.623 }' 00:22:45.623 07:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:45.623 07:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:46.189 07:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.189 07:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:46.446 07:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:46.447 07:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:46.705 [2024-07-25 07:28:19.139658] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.705 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:46.963 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:46.963 "name": "Existed_Raid", 00:22:46.963 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:46.963 "strip_size_kb": 0, 00:22:46.963 "state": "configuring", 00:22:46.963 "raid_level": "raid1", 00:22:46.963 "superblock": true, 00:22:46.963 "num_base_bdevs": 4, 00:22:46.963 "num_base_bdevs_discovered": 3, 00:22:46.963 "num_base_bdevs_operational": 4, 00:22:46.963 "base_bdevs_list": [ 00:22:46.963 { 00:22:46.963 "name": "BaseBdev1", 00:22:46.963 "uuid": 
"22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:46.963 "is_configured": true, 00:22:46.963 "data_offset": 2048, 00:22:46.963 "data_size": 63488 00:22:46.963 }, 00:22:46.963 { 00:22:46.963 "name": null, 00:22:46.963 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:46.963 "is_configured": false, 00:22:46.963 "data_offset": 2048, 00:22:46.963 "data_size": 63488 00:22:46.963 }, 00:22:46.963 { 00:22:46.963 "name": "BaseBdev3", 00:22:46.963 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:46.963 "is_configured": true, 00:22:46.963 "data_offset": 2048, 00:22:46.963 "data_size": 63488 00:22:46.963 }, 00:22:46.963 { 00:22:46.963 "name": "BaseBdev4", 00:22:46.963 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:46.963 "is_configured": true, 00:22:46.963 "data_offset": 2048, 00:22:46.963 "data_size": 63488 00:22:46.963 } 00:22:46.963 ] 00:22:46.963 }' 00:22:46.963 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:46.963 07:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:47.529 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.529 07:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:47.787 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:47.787 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:48.045 [2024-07-25 07:28:20.411026] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.045 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:48.304 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:48.304 "name": "Existed_Raid", 00:22:48.304 "uuid": 
"964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:48.304 "strip_size_kb": 0, 00:22:48.304 "state": "configuring", 00:22:48.304 "raid_level": "raid1", 00:22:48.304 "superblock": true, 00:22:48.304 "num_base_bdevs": 4, 00:22:48.304 "num_base_bdevs_discovered": 2, 00:22:48.304 "num_base_bdevs_operational": 4, 00:22:48.304 "base_bdevs_list": [ 00:22:48.304 { 00:22:48.304 "name": null, 00:22:48.304 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:48.304 "is_configured": false, 00:22:48.304 "data_offset": 2048, 00:22:48.304 "data_size": 63488 00:22:48.304 }, 00:22:48.304 { 00:22:48.304 "name": null, 00:22:48.304 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:48.304 "is_configured": false, 00:22:48.304 "data_offset": 2048, 00:22:48.304 "data_size": 63488 00:22:48.304 }, 00:22:48.304 { 00:22:48.304 "name": "BaseBdev3", 00:22:48.304 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:48.304 "is_configured": true, 00:22:48.304 "data_offset": 2048, 00:22:48.304 "data_size": 63488 00:22:48.304 }, 00:22:48.304 { 00:22:48.304 "name": "BaseBdev4", 00:22:48.304 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:48.304 "is_configured": true, 00:22:48.304 "data_offset": 2048, 00:22:48.304 "data_size": 63488 00:22:48.304 } 00:22:48.304 ] 00:22:48.304 }' 00:22:48.304 07:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:48.304 07:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:48.870 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.870 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:48.870 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:48.870 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:49.128 [2024-07-25 07:28:21.608199] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.128 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:49.385 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.385 "name": "Existed_Raid", 00:22:49.385 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:49.385 "strip_size_kb": 0, 00:22:49.385 "state": "configuring", 00:22:49.385 "raid_level": "raid1", 00:22:49.385 "superblock": true, 00:22:49.385 "num_base_bdevs": 4, 00:22:49.385 "num_base_bdevs_discovered": 3, 00:22:49.385 "num_base_bdevs_operational": 4, 00:22:49.385 "base_bdevs_list": [ 00:22:49.385 { 00:22:49.385 "name": null, 00:22:49.385 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:49.385 "is_configured": false, 00:22:49.385 "data_offset": 2048, 00:22:49.385 "data_size": 63488 00:22:49.385 }, 00:22:49.385 { 00:22:49.385 "name": "BaseBdev2", 00:22:49.385 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:49.385 "is_configured": true, 00:22:49.385 "data_offset": 2048, 00:22:49.385 "data_size": 63488 00:22:49.385 }, 00:22:49.385 { 00:22:49.385 "name": "BaseBdev3", 00:22:49.385 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:49.385 "is_configured": true, 00:22:49.385 "data_offset": 2048, 00:22:49.385 "data_size": 63488 00:22:49.385 }, 00:22:49.385 { 00:22:49.385 "name": "BaseBdev4", 00:22:49.385 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:49.385 "is_configured": true, 00:22:49.385 "data_offset": 2048, 00:22:49.385 "data_size": 63488 00:22:49.385 } 00:22:49.385 ] 00:22:49.385 }' 00:22:49.385 07:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.385 07:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:49.951 07:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.951 07:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:50.209 07:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:50.209 07:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:50.209 07:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.467 07:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 22852a07-641f-4464-b1ef-f8455c5ef003 00:22:50.725 [2024-07-25 07:28:23.107353] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:50.725 [2024-07-25 07:28:23.107496] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2717a30 00:22:50.725 [2024-07-25 07:28:23.107509] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:50.725 [2024-07-25 07:28:23.107662] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x270c6f0 00:22:50.725 [2024-07-25 07:28:23.107772] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x2717a30 00:22:50.725 [2024-07-25 07:28:23.107781] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2717a30 00:22:50.725 [2024-07-25 07:28:23.107862] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:50.725 NewBaseBdev 00:22:50.725 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:50.725 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:22:50.725 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:50.725 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:50.725 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:50.725 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:50.725 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:50.983 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:51.242 [ 00:22:51.242 { 00:22:51.242 "name": "NewBaseBdev", 00:22:51.242 "aliases": [ 00:22:51.242 "22852a07-641f-4464-b1ef-f8455c5ef003" 00:22:51.242 ], 00:22:51.242 "product_name": "Malloc disk", 00:22:51.242 "block_size": 512, 00:22:51.242 "num_blocks": 65536, 00:22:51.242 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:51.242 "assigned_rate_limits": { 00:22:51.242 "rw_ios_per_sec": 0, 00:22:51.242 "rw_mbytes_per_sec": 0, 00:22:51.242 "r_mbytes_per_sec": 0, 00:22:51.242 "w_mbytes_per_sec": 0 00:22:51.242 }, 00:22:51.242 "claimed": true, 00:22:51.242 "claim_type": "exclusive_write", 00:22:51.242 "zoned": false, 00:22:51.242 "supported_io_types": { 00:22:51.242 "read": true, 00:22:51.242 "write": true, 00:22:51.242 "unmap": true, 00:22:51.242 "flush": true, 00:22:51.242 "reset": true, 00:22:51.242 "nvme_admin": false, 00:22:51.242 "nvme_io": false, 00:22:51.242 "nvme_io_md": false, 00:22:51.242 "write_zeroes": true, 00:22:51.242 "zcopy": true, 00:22:51.242 "get_zone_info": false, 00:22:51.242 "zone_management": false, 00:22:51.242 "zone_append": false, 00:22:51.242 "compare": false, 00:22:51.242 "compare_and_write": false, 00:22:51.242 "abort": true, 00:22:51.242 "seek_hole": false, 00:22:51.242 "seek_data": false, 00:22:51.242 "copy": true, 00:22:51.242 "nvme_iov_md": false 00:22:51.242 }, 00:22:51.242 "memory_domains": [ 00:22:51.242 { 00:22:51.242 "dma_device_id": "system", 00:22:51.242 "dma_device_type": 1 00:22:51.242 }, 00:22:51.242 { 00:22:51.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.242 "dma_device_type": 2 00:22:51.242 } 00:22:51.242 ], 00:22:51.242 "driver_specific": {} 00:22:51.242 } 00:22:51.242 ] 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.242 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:51.500 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:51.500 "name": "Existed_Raid", 00:22:51.500 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:51.500 "strip_size_kb": 0, 00:22:51.500 "state": "online", 00:22:51.500 "raid_level": "raid1", 00:22:51.500 "superblock": true, 00:22:51.500 "num_base_bdevs": 4, 00:22:51.500 "num_base_bdevs_discovered": 4, 00:22:51.500 "num_base_bdevs_operational": 4, 00:22:51.500 "base_bdevs_list": [ 00:22:51.500 { 00:22:51.500 "name": "NewBaseBdev", 00:22:51.500 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:51.500 "is_configured": true, 00:22:51.500 "data_offset": 2048, 00:22:51.500 "data_size": 63488 00:22:51.500 }, 00:22:51.500 { 00:22:51.500 "name": "BaseBdev2", 00:22:51.500 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:51.500 "is_configured": true, 00:22:51.500 "data_offset": 2048, 00:22:51.500 "data_size": 63488 00:22:51.500 }, 00:22:51.500 { 00:22:51.500 "name": "BaseBdev3", 00:22:51.500 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:51.500 "is_configured": true, 00:22:51.500 "data_offset": 2048, 00:22:51.500 "data_size": 63488 00:22:51.500 }, 00:22:51.500 { 00:22:51.500 "name": "BaseBdev4", 00:22:51.500 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:51.500 "is_configured": true, 00:22:51.500 "data_offset": 2048, 00:22:51.500 "data_size": 63488 00:22:51.500 } 00:22:51.500 ] 00:22:51.500 }' 00:22:51.501 07:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:51.501 07:28:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:52.067 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:52.067 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:52.067 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:52.067 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:52.067 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:52.067 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:52.067 07:28:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:52.067 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:52.067 [2024-07-25 07:28:24.587532] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:52.325 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:52.325 "name": "Existed_Raid", 00:22:52.325 "aliases": [ 00:22:52.325 "964125d4-ecf9-48ae-a031-70772451b9a3" 00:22:52.325 ], 00:22:52.325 "product_name": "Raid Volume", 00:22:52.325 "block_size": 512, 00:22:52.325 "num_blocks": 63488, 00:22:52.325 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:52.325 "assigned_rate_limits": { 00:22:52.325 "rw_ios_per_sec": 0, 00:22:52.325 "rw_mbytes_per_sec": 0, 00:22:52.325 "r_mbytes_per_sec": 0, 00:22:52.325 "w_mbytes_per_sec": 0 00:22:52.325 }, 00:22:52.325 "claimed": false, 00:22:52.325 "zoned": false, 00:22:52.325 "supported_io_types": { 00:22:52.325 "read": true, 00:22:52.325 "write": true, 00:22:52.325 "unmap": false, 00:22:52.325 "flush": false, 00:22:52.325 "reset": true, 00:22:52.325 "nvme_admin": false, 00:22:52.325 "nvme_io": false, 00:22:52.325 "nvme_io_md": false, 00:22:52.325 "write_zeroes": true, 00:22:52.325 "zcopy": false, 00:22:52.325 "get_zone_info": false, 00:22:52.325 "zone_management": false, 00:22:52.325 "zone_append": false, 00:22:52.325 "compare": false, 00:22:52.325 "compare_and_write": false, 00:22:52.325 "abort": false, 00:22:52.325 "seek_hole": false, 00:22:52.325 "seek_data": false, 00:22:52.325 "copy": false, 00:22:52.325 "nvme_iov_md": false 00:22:52.325 }, 00:22:52.325 "memory_domains": [ 00:22:52.325 { 00:22:52.325 "dma_device_id": "system", 00:22:52.325 "dma_device_type": 1 00:22:52.325 }, 00:22:52.325 { 00:22:52.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:52.325 "dma_device_type": 2 00:22:52.325 }, 00:22:52.325 { 00:22:52.325 "dma_device_id": "system", 00:22:52.326 "dma_device_type": 1 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:52.326 "dma_device_type": 2 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "dma_device_id": "system", 00:22:52.326 "dma_device_type": 1 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:52.326 "dma_device_type": 2 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "dma_device_id": "system", 00:22:52.326 "dma_device_type": 1 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:52.326 "dma_device_type": 2 00:22:52.326 } 00:22:52.326 ], 00:22:52.326 "driver_specific": { 00:22:52.326 "raid": { 00:22:52.326 "uuid": "964125d4-ecf9-48ae-a031-70772451b9a3", 00:22:52.326 "strip_size_kb": 0, 00:22:52.326 "state": "online", 00:22:52.326 "raid_level": "raid1", 00:22:52.326 "superblock": true, 00:22:52.326 "num_base_bdevs": 4, 00:22:52.326 "num_base_bdevs_discovered": 4, 00:22:52.326 "num_base_bdevs_operational": 4, 00:22:52.326 "base_bdevs_list": [ 00:22:52.326 { 00:22:52.326 "name": "NewBaseBdev", 00:22:52.326 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:52.326 "is_configured": true, 00:22:52.326 "data_offset": 2048, 00:22:52.326 "data_size": 63488 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "name": "BaseBdev2", 00:22:52.326 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:52.326 "is_configured": true, 00:22:52.326 "data_offset": 2048, 00:22:52.326 
"data_size": 63488 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "name": "BaseBdev3", 00:22:52.326 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:52.326 "is_configured": true, 00:22:52.326 "data_offset": 2048, 00:22:52.326 "data_size": 63488 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "name": "BaseBdev4", 00:22:52.326 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:52.326 "is_configured": true, 00:22:52.326 "data_offset": 2048, 00:22:52.326 "data_size": 63488 00:22:52.326 } 00:22:52.326 ] 00:22:52.326 } 00:22:52.326 } 00:22:52.326 }' 00:22:52.326 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:52.326 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:52.326 BaseBdev2 00:22:52.326 BaseBdev3 00:22:52.326 BaseBdev4' 00:22:52.326 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:52.326 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:52.326 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:52.584 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:52.584 "name": "NewBaseBdev", 00:22:52.584 "aliases": [ 00:22:52.584 "22852a07-641f-4464-b1ef-f8455c5ef003" 00:22:52.584 ], 00:22:52.584 "product_name": "Malloc disk", 00:22:52.584 "block_size": 512, 00:22:52.584 "num_blocks": 65536, 00:22:52.584 "uuid": "22852a07-641f-4464-b1ef-f8455c5ef003", 00:22:52.584 "assigned_rate_limits": { 00:22:52.584 "rw_ios_per_sec": 0, 00:22:52.584 "rw_mbytes_per_sec": 0, 00:22:52.584 "r_mbytes_per_sec": 0, 00:22:52.584 "w_mbytes_per_sec": 0 00:22:52.584 }, 00:22:52.584 "claimed": true, 00:22:52.584 "claim_type": "exclusive_write", 00:22:52.584 "zoned": false, 00:22:52.584 "supported_io_types": { 00:22:52.584 "read": true, 00:22:52.584 "write": true, 00:22:52.584 "unmap": true, 00:22:52.584 "flush": true, 00:22:52.584 "reset": true, 00:22:52.584 "nvme_admin": false, 00:22:52.584 "nvme_io": false, 00:22:52.584 "nvme_io_md": false, 00:22:52.584 "write_zeroes": true, 00:22:52.584 "zcopy": true, 00:22:52.584 "get_zone_info": false, 00:22:52.584 "zone_management": false, 00:22:52.584 "zone_append": false, 00:22:52.584 "compare": false, 00:22:52.584 "compare_and_write": false, 00:22:52.584 "abort": true, 00:22:52.584 "seek_hole": false, 00:22:52.584 "seek_data": false, 00:22:52.584 "copy": true, 00:22:52.584 "nvme_iov_md": false 00:22:52.584 }, 00:22:52.584 "memory_domains": [ 00:22:52.584 { 00:22:52.584 "dma_device_id": "system", 00:22:52.584 "dma_device_type": 1 00:22:52.584 }, 00:22:52.584 { 00:22:52.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:52.584 "dma_device_type": 2 00:22:52.584 } 00:22:52.584 ], 00:22:52.584 "driver_specific": {} 00:22:52.584 }' 00:22:52.584 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:52.584 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:52.584 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:52.584 07:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:52.584 07:28:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:52.584 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:52.584 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:52.584 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:52.843 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:52.843 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.843 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.843 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:52.843 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:52.843 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:52.843 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:53.101 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:53.101 "name": "BaseBdev2", 00:22:53.101 "aliases": [ 00:22:53.101 "5799e743-ba6e-45ff-b83f-707e09d7cf11" 00:22:53.101 ], 00:22:53.101 "product_name": "Malloc disk", 00:22:53.101 "block_size": 512, 00:22:53.101 "num_blocks": 65536, 00:22:53.101 "uuid": "5799e743-ba6e-45ff-b83f-707e09d7cf11", 00:22:53.101 "assigned_rate_limits": { 00:22:53.101 "rw_ios_per_sec": 0, 00:22:53.101 "rw_mbytes_per_sec": 0, 00:22:53.101 "r_mbytes_per_sec": 0, 00:22:53.101 "w_mbytes_per_sec": 0 00:22:53.101 }, 00:22:53.101 "claimed": true, 00:22:53.101 "claim_type": "exclusive_write", 00:22:53.101 "zoned": false, 00:22:53.101 "supported_io_types": { 00:22:53.101 "read": true, 00:22:53.101 "write": true, 00:22:53.101 "unmap": true, 00:22:53.101 "flush": true, 00:22:53.101 "reset": true, 00:22:53.101 "nvme_admin": false, 00:22:53.101 "nvme_io": false, 00:22:53.101 "nvme_io_md": false, 00:22:53.101 "write_zeroes": true, 00:22:53.101 "zcopy": true, 00:22:53.101 "get_zone_info": false, 00:22:53.101 "zone_management": false, 00:22:53.101 "zone_append": false, 00:22:53.101 "compare": false, 00:22:53.101 "compare_and_write": false, 00:22:53.101 "abort": true, 00:22:53.101 "seek_hole": false, 00:22:53.101 "seek_data": false, 00:22:53.101 "copy": true, 00:22:53.101 "nvme_iov_md": false 00:22:53.101 }, 00:22:53.101 "memory_domains": [ 00:22:53.101 { 00:22:53.101 "dma_device_id": "system", 00:22:53.101 "dma_device_type": 1 00:22:53.102 }, 00:22:53.102 { 00:22:53.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:53.102 "dma_device_type": 2 00:22:53.102 } 00:22:53.102 ], 00:22:53.102 "driver_specific": {} 00:22:53.102 }' 00:22:53.102 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:53.102 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:53.102 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:53.102 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:53.102 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:53.102 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 
null == null ]] 00:22:53.102 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:53.360 07:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:53.617 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:53.617 "name": "BaseBdev3", 00:22:53.617 "aliases": [ 00:22:53.617 "5971c5c1-aeb7-44fc-a0b6-1ea506682c34" 00:22:53.617 ], 00:22:53.617 "product_name": "Malloc disk", 00:22:53.617 "block_size": 512, 00:22:53.617 "num_blocks": 65536, 00:22:53.617 "uuid": "5971c5c1-aeb7-44fc-a0b6-1ea506682c34", 00:22:53.617 "assigned_rate_limits": { 00:22:53.617 "rw_ios_per_sec": 0, 00:22:53.617 "rw_mbytes_per_sec": 0, 00:22:53.617 "r_mbytes_per_sec": 0, 00:22:53.617 "w_mbytes_per_sec": 0 00:22:53.617 }, 00:22:53.617 "claimed": true, 00:22:53.617 "claim_type": "exclusive_write", 00:22:53.617 "zoned": false, 00:22:53.617 "supported_io_types": { 00:22:53.617 "read": true, 00:22:53.617 "write": true, 00:22:53.617 "unmap": true, 00:22:53.617 "flush": true, 00:22:53.617 "reset": true, 00:22:53.617 "nvme_admin": false, 00:22:53.617 "nvme_io": false, 00:22:53.617 "nvme_io_md": false, 00:22:53.617 "write_zeroes": true, 00:22:53.617 "zcopy": true, 00:22:53.617 "get_zone_info": false, 00:22:53.617 "zone_management": false, 00:22:53.617 "zone_append": false, 00:22:53.617 "compare": false, 00:22:53.617 "compare_and_write": false, 00:22:53.617 "abort": true, 00:22:53.617 "seek_hole": false, 00:22:53.617 "seek_data": false, 00:22:53.617 "copy": true, 00:22:53.618 "nvme_iov_md": false 00:22:53.618 }, 00:22:53.618 "memory_domains": [ 00:22:53.618 { 00:22:53.618 "dma_device_id": "system", 00:22:53.618 "dma_device_type": 1 00:22:53.618 }, 00:22:53.618 { 00:22:53.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:53.618 "dma_device_type": 2 00:22:53.618 } 00:22:53.618 ], 00:22:53.618 "driver_specific": {} 00:22:53.618 }' 00:22:53.618 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:53.618 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:53.618 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:53.618 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:53.618 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:53.876 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:54.134 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:54.134 "name": "BaseBdev4", 00:22:54.134 "aliases": [ 00:22:54.134 "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d" 00:22:54.134 ], 00:22:54.134 "product_name": "Malloc disk", 00:22:54.134 "block_size": 512, 00:22:54.134 "num_blocks": 65536, 00:22:54.134 "uuid": "5bad2a70-83e5-4a08-9f4b-0c19a5befc6d", 00:22:54.134 "assigned_rate_limits": { 00:22:54.134 "rw_ios_per_sec": 0, 00:22:54.134 "rw_mbytes_per_sec": 0, 00:22:54.134 "r_mbytes_per_sec": 0, 00:22:54.134 "w_mbytes_per_sec": 0 00:22:54.134 }, 00:22:54.134 "claimed": true, 00:22:54.134 "claim_type": "exclusive_write", 00:22:54.134 "zoned": false, 00:22:54.134 "supported_io_types": { 00:22:54.134 "read": true, 00:22:54.134 "write": true, 00:22:54.134 "unmap": true, 00:22:54.134 "flush": true, 00:22:54.134 "reset": true, 00:22:54.134 "nvme_admin": false, 00:22:54.134 "nvme_io": false, 00:22:54.134 "nvme_io_md": false, 00:22:54.134 "write_zeroes": true, 00:22:54.134 "zcopy": true, 00:22:54.134 "get_zone_info": false, 00:22:54.134 "zone_management": false, 00:22:54.134 "zone_append": false, 00:22:54.134 "compare": false, 00:22:54.134 "compare_and_write": false, 00:22:54.134 "abort": true, 00:22:54.134 "seek_hole": false, 00:22:54.134 "seek_data": false, 00:22:54.134 "copy": true, 00:22:54.134 "nvme_iov_md": false 00:22:54.134 }, 00:22:54.134 "memory_domains": [ 00:22:54.134 { 00:22:54.134 "dma_device_id": "system", 00:22:54.134 "dma_device_type": 1 00:22:54.134 }, 00:22:54.134 { 00:22:54.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.134 "dma_device_type": 2 00:22:54.134 } 00:22:54.134 ], 00:22:54.134 "driver_specific": {} 00:22:54.134 }' 00:22:54.134 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.134 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.134 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:54.134 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:54.393 07:28:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:54.393 07:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:54.651 [2024-07-25 07:28:27.133959] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:54.651 [2024-07-25 07:28:27.133981] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:54.651 [2024-07-25 07:28:27.134028] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:54.651 [2024-07-25 07:28:27.134281] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:54.651 [2024-07-25 07:28:27.134293] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2717a30 name Existed_Raid, state offline 00:22:54.651 07:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1701439 00:22:54.651 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1701439 ']' 00:22:54.651 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1701439 00:22:54.651 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:22:54.651 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:54.651 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1701439 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1701439' 00:22:54.910 killing process with pid 1701439 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1701439 00:22:54.910 [2024-07-25 07:28:27.206446] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1701439 00:22:54.910 [2024-07-25 07:28:27.238710] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:54.910 00:22:54.910 real 0m31.575s 00:22:54.910 user 0m57.865s 00:22:54.910 sys 0m5.759s 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:54.910 07:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:54.910 ************************************ 00:22:54.910 END TEST raid_state_function_test_sb 00:22:54.910 ************************************ 00:22:55.168 07:28:27 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:22:55.168 07:28:27 
bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:22:55.168 07:28:27 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:55.168 07:28:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:55.168 ************************************ 00:22:55.168 START TEST raid_superblock_test 00:22:55.168 ************************************ 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 4 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:22:55.168 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1707390 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1707390 /var/tmp/spdk-raid.sock 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1707390 ']' 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:55.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:55.169 07:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:55.169 [2024-07-25 07:28:27.573720] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:22:55.169 [2024-07-25 07:28:27.573776] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707390 ] 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.0 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.1 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.2 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.3 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.4 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.5 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.6 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:01.7 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.0 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.1 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.2 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.3 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.4 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.5 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.6 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3d:02.7 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.0 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.1 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.2 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.3 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.4 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.5 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.6 cannot be used 00:22:55.169 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:01.7 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.0 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.1 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.2 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.3 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.4 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.5 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.6 cannot be used 00:22:55.169 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:55.169 EAL: Requested device 0000:3f:02.7 cannot be used 00:22:55.428 [2024-07-25 07:28:27.703260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:55.428 [2024-07-25 07:28:27.789647] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:22:55.428 [2024-07-25 07:28:27.850925] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:55.428 [2024-07-25 07:28:27.850966] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:56.002 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:56.283 malloc1 00:22:56.283 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:56.541 [2024-07-25 07:28:28.916027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:56.541 [2024-07-25 07:28:28.916069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:56.541 [2024-07-25 07:28:28.916088] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2164280 00:22:56.541 [2024-07-25 07:28:28.916099] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:56.541 [2024-07-25 07:28:28.917653] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:56.541 [2024-07-25 07:28:28.917681] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:56.541 pt1 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:56.541 07:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:56.799 malloc2 00:22:56.799 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:57.057 [2024-07-25 07:28:29.377737] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:57.057 [2024-07-25 07:28:29.377782] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.057 [2024-07-25 07:28:29.377797] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230f8c0 00:22:57.057 [2024-07-25 07:28:29.377809] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.057 [2024-07-25 07:28:29.379207] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.057 [2024-07-25 07:28:29.379235] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:57.057 pt2 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:57.057 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:57.315 malloc3 00:22:57.315 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:57.315 [2024-07-25 07:28:29.823322] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:57.315 [2024-07-25 07:28:29.823366] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.315 [2024-07-25 07:28:29.823387] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230fef0 00:22:57.315 [2024-07-25 07:28:29.823398] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.316 [2024-07-25 07:28:29.824706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.316 [2024-07-25 07:28:29.824733] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:57.316 pt3 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:57.316 07:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:57.574 malloc4 00:22:57.574 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:57.832 [2024-07-25 07:28:30.269349] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:57.832 [2024-07-25 07:28:30.269396] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.832 [2024-07-25 07:28:30.269413] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2313330 00:22:57.832 [2024-07-25 07:28:30.269425] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.832 [2024-07-25 07:28:30.270855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.832 [2024-07-25 07:28:30.270882] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:57.832 pt4 00:22:57.832 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:57.832 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:57.832 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:58.091 [2024-07-25 07:28:30.481928] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:58.091 [2024-07-25 07:28:30.483060] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:58.091 [2024-07-25 07:28:30.483111] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:58.091 [2024-07-25 07:28:30.483158] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:58.091 [2024-07-25 07:28:30.483322] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2312720 00:22:58.091 [2024-07-25 07:28:30.483332] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:58.091 [2024-07-25 07:28:30.483511] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2315e30 00:22:58.091 [2024-07-25 07:28:30.483649] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2312720 00:22:58.091 [2024-07-25 07:28:30.483658] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2312720 00:22:58.091 [2024-07-25 07:28:30.483750] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.091 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.349 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.349 "name": "raid_bdev1", 00:22:58.349 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:22:58.349 "strip_size_kb": 0, 00:22:58.349 "state": "online", 00:22:58.349 "raid_level": "raid1", 00:22:58.349 "superblock": true, 00:22:58.349 "num_base_bdevs": 4, 00:22:58.349 "num_base_bdevs_discovered": 4, 00:22:58.349 "num_base_bdevs_operational": 4, 00:22:58.349 "base_bdevs_list": [ 00:22:58.349 { 00:22:58.349 "name": "pt1", 00:22:58.349 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:58.349 "is_configured": true, 00:22:58.349 "data_offset": 2048, 00:22:58.349 "data_size": 63488 00:22:58.349 }, 00:22:58.349 { 00:22:58.349 
"name": "pt2", 00:22:58.349 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:58.349 "is_configured": true, 00:22:58.349 "data_offset": 2048, 00:22:58.349 "data_size": 63488 00:22:58.349 }, 00:22:58.349 { 00:22:58.349 "name": "pt3", 00:22:58.349 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:58.349 "is_configured": true, 00:22:58.349 "data_offset": 2048, 00:22:58.349 "data_size": 63488 00:22:58.349 }, 00:22:58.349 { 00:22:58.349 "name": "pt4", 00:22:58.349 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:58.349 "is_configured": true, 00:22:58.349 "data_offset": 2048, 00:22:58.349 "data_size": 63488 00:22:58.349 } 00:22:58.349 ] 00:22:58.349 }' 00:22:58.349 07:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.349 07:28:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:58.915 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:22:58.915 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:58.915 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:58.915 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:58.915 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:58.915 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:58.915 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:58.916 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:59.174 [2024-07-25 07:28:31.480797] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:59.174 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:59.174 "name": "raid_bdev1", 00:22:59.174 "aliases": [ 00:22:59.174 "ddcecdd8-f9bf-44b8-8cc5-946205984c83" 00:22:59.174 ], 00:22:59.174 "product_name": "Raid Volume", 00:22:59.174 "block_size": 512, 00:22:59.174 "num_blocks": 63488, 00:22:59.174 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:22:59.174 "assigned_rate_limits": { 00:22:59.174 "rw_ios_per_sec": 0, 00:22:59.174 "rw_mbytes_per_sec": 0, 00:22:59.174 "r_mbytes_per_sec": 0, 00:22:59.174 "w_mbytes_per_sec": 0 00:22:59.174 }, 00:22:59.174 "claimed": false, 00:22:59.174 "zoned": false, 00:22:59.174 "supported_io_types": { 00:22:59.174 "read": true, 00:22:59.174 "write": true, 00:22:59.174 "unmap": false, 00:22:59.174 "flush": false, 00:22:59.174 "reset": true, 00:22:59.174 "nvme_admin": false, 00:22:59.174 "nvme_io": false, 00:22:59.174 "nvme_io_md": false, 00:22:59.174 "write_zeroes": true, 00:22:59.174 "zcopy": false, 00:22:59.174 "get_zone_info": false, 00:22:59.174 "zone_management": false, 00:22:59.174 "zone_append": false, 00:22:59.174 "compare": false, 00:22:59.174 "compare_and_write": false, 00:22:59.174 "abort": false, 00:22:59.174 "seek_hole": false, 00:22:59.174 "seek_data": false, 00:22:59.174 "copy": false, 00:22:59.174 "nvme_iov_md": false 00:22:59.174 }, 00:22:59.174 "memory_domains": [ 00:22:59.174 { 00:22:59.174 "dma_device_id": "system", 00:22:59.174 "dma_device_type": 1 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.174 "dma_device_type": 2 00:22:59.174 }, 00:22:59.174 { 
00:22:59.174 "dma_device_id": "system", 00:22:59.174 "dma_device_type": 1 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.174 "dma_device_type": 2 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "dma_device_id": "system", 00:22:59.174 "dma_device_type": 1 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.174 "dma_device_type": 2 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "dma_device_id": "system", 00:22:59.174 "dma_device_type": 1 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.174 "dma_device_type": 2 00:22:59.174 } 00:22:59.174 ], 00:22:59.174 "driver_specific": { 00:22:59.174 "raid": { 00:22:59.174 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:22:59.174 "strip_size_kb": 0, 00:22:59.174 "state": "online", 00:22:59.174 "raid_level": "raid1", 00:22:59.174 "superblock": true, 00:22:59.174 "num_base_bdevs": 4, 00:22:59.174 "num_base_bdevs_discovered": 4, 00:22:59.174 "num_base_bdevs_operational": 4, 00:22:59.174 "base_bdevs_list": [ 00:22:59.174 { 00:22:59.174 "name": "pt1", 00:22:59.174 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:59.174 "is_configured": true, 00:22:59.174 "data_offset": 2048, 00:22:59.174 "data_size": 63488 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "name": "pt2", 00:22:59.174 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:59.174 "is_configured": true, 00:22:59.174 "data_offset": 2048, 00:22:59.174 "data_size": 63488 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "name": "pt3", 00:22:59.174 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:59.174 "is_configured": true, 00:22:59.174 "data_offset": 2048, 00:22:59.174 "data_size": 63488 00:22:59.174 }, 00:22:59.174 { 00:22:59.174 "name": "pt4", 00:22:59.174 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:59.174 "is_configured": true, 00:22:59.174 "data_offset": 2048, 00:22:59.174 "data_size": 63488 00:22:59.174 } 00:22:59.174 ] 00:22:59.174 } 00:22:59.174 } 00:22:59.174 }' 00:22:59.174 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:59.174 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:59.174 pt2 00:22:59.174 pt3 00:22:59.174 pt4' 00:22:59.174 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:59.174 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:59.174 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:59.432 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:59.432 "name": "pt1", 00:22:59.432 "aliases": [ 00:22:59.432 "00000000-0000-0000-0000-000000000001" 00:22:59.432 ], 00:22:59.432 "product_name": "passthru", 00:22:59.432 "block_size": 512, 00:22:59.432 "num_blocks": 65536, 00:22:59.432 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:59.432 "assigned_rate_limits": { 00:22:59.432 "rw_ios_per_sec": 0, 00:22:59.432 "rw_mbytes_per_sec": 0, 00:22:59.432 "r_mbytes_per_sec": 0, 00:22:59.432 "w_mbytes_per_sec": 0 00:22:59.432 }, 00:22:59.432 "claimed": true, 00:22:59.432 "claim_type": "exclusive_write", 00:22:59.432 "zoned": false, 00:22:59.432 "supported_io_types": { 00:22:59.432 "read": true, 00:22:59.432 "write": true, 
00:22:59.432 "unmap": true, 00:22:59.432 "flush": true, 00:22:59.432 "reset": true, 00:22:59.432 "nvme_admin": false, 00:22:59.432 "nvme_io": false, 00:22:59.432 "nvme_io_md": false, 00:22:59.432 "write_zeroes": true, 00:22:59.432 "zcopy": true, 00:22:59.432 "get_zone_info": false, 00:22:59.432 "zone_management": false, 00:22:59.432 "zone_append": false, 00:22:59.432 "compare": false, 00:22:59.432 "compare_and_write": false, 00:22:59.432 "abort": true, 00:22:59.432 "seek_hole": false, 00:22:59.432 "seek_data": false, 00:22:59.432 "copy": true, 00:22:59.432 "nvme_iov_md": false 00:22:59.432 }, 00:22:59.432 "memory_domains": [ 00:22:59.432 { 00:22:59.432 "dma_device_id": "system", 00:22:59.432 "dma_device_type": 1 00:22:59.432 }, 00:22:59.432 { 00:22:59.432 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.432 "dma_device_type": 2 00:22:59.432 } 00:22:59.432 ], 00:22:59.432 "driver_specific": { 00:22:59.433 "passthru": { 00:22:59.433 "name": "pt1", 00:22:59.433 "base_bdev_name": "malloc1" 00:22:59.433 } 00:22:59.433 } 00:22:59.433 }' 00:22:59.433 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.433 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.433 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:59.433 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:59.433 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:59.433 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:59.433 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.691 07:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.691 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:59.691 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.691 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.691 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:59.691 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:59.691 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:59.691 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:59.950 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:59.950 "name": "pt2", 00:22:59.950 "aliases": [ 00:22:59.950 "00000000-0000-0000-0000-000000000002" 00:22:59.950 ], 00:22:59.950 "product_name": "passthru", 00:22:59.950 "block_size": 512, 00:22:59.950 "num_blocks": 65536, 00:22:59.950 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:59.950 "assigned_rate_limits": { 00:22:59.950 "rw_ios_per_sec": 0, 00:22:59.950 "rw_mbytes_per_sec": 0, 00:22:59.950 "r_mbytes_per_sec": 0, 00:22:59.950 "w_mbytes_per_sec": 0 00:22:59.950 }, 00:22:59.950 "claimed": true, 00:22:59.950 "claim_type": "exclusive_write", 00:22:59.950 "zoned": false, 00:22:59.950 "supported_io_types": { 00:22:59.950 "read": true, 00:22:59.950 "write": true, 00:22:59.950 "unmap": true, 00:22:59.950 "flush": true, 00:22:59.950 "reset": true, 00:22:59.950 "nvme_admin": false, 00:22:59.950 
"nvme_io": false, 00:22:59.950 "nvme_io_md": false, 00:22:59.950 "write_zeroes": true, 00:22:59.950 "zcopy": true, 00:22:59.950 "get_zone_info": false, 00:22:59.950 "zone_management": false, 00:22:59.950 "zone_append": false, 00:22:59.950 "compare": false, 00:22:59.950 "compare_and_write": false, 00:22:59.950 "abort": true, 00:22:59.950 "seek_hole": false, 00:22:59.950 "seek_data": false, 00:22:59.950 "copy": true, 00:22:59.950 "nvme_iov_md": false 00:22:59.950 }, 00:22:59.950 "memory_domains": [ 00:22:59.950 { 00:22:59.950 "dma_device_id": "system", 00:22:59.950 "dma_device_type": 1 00:22:59.950 }, 00:22:59.950 { 00:22:59.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.950 "dma_device_type": 2 00:22:59.950 } 00:22:59.950 ], 00:22:59.950 "driver_specific": { 00:22:59.950 "passthru": { 00:22:59.950 "name": "pt2", 00:22:59.950 "base_bdev_name": "malloc2" 00:22:59.950 } 00:22:59.950 } 00:22:59.950 }' 00:22:59.950 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.950 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.950 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:59.950 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.207 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.207 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:00.207 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:00.208 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:00.465 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:00.465 "name": "pt3", 00:23:00.465 "aliases": [ 00:23:00.465 "00000000-0000-0000-0000-000000000003" 00:23:00.465 ], 00:23:00.465 "product_name": "passthru", 00:23:00.465 "block_size": 512, 00:23:00.465 "num_blocks": 65536, 00:23:00.465 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:00.465 "assigned_rate_limits": { 00:23:00.465 "rw_ios_per_sec": 0, 00:23:00.465 "rw_mbytes_per_sec": 0, 00:23:00.465 "r_mbytes_per_sec": 0, 00:23:00.465 "w_mbytes_per_sec": 0 00:23:00.465 }, 00:23:00.465 "claimed": true, 00:23:00.465 "claim_type": "exclusive_write", 00:23:00.465 "zoned": false, 00:23:00.465 "supported_io_types": { 00:23:00.465 "read": true, 00:23:00.465 "write": true, 00:23:00.465 "unmap": true, 00:23:00.465 "flush": true, 00:23:00.465 "reset": true, 00:23:00.465 "nvme_admin": false, 00:23:00.465 "nvme_io": false, 00:23:00.465 "nvme_io_md": false, 00:23:00.465 "write_zeroes": true, 00:23:00.465 "zcopy": true, 00:23:00.465 
"get_zone_info": false, 00:23:00.465 "zone_management": false, 00:23:00.465 "zone_append": false, 00:23:00.465 "compare": false, 00:23:00.465 "compare_and_write": false, 00:23:00.465 "abort": true, 00:23:00.466 "seek_hole": false, 00:23:00.466 "seek_data": false, 00:23:00.466 "copy": true, 00:23:00.466 "nvme_iov_md": false 00:23:00.466 }, 00:23:00.466 "memory_domains": [ 00:23:00.466 { 00:23:00.466 "dma_device_id": "system", 00:23:00.466 "dma_device_type": 1 00:23:00.466 }, 00:23:00.466 { 00:23:00.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.466 "dma_device_type": 2 00:23:00.466 } 00:23:00.466 ], 00:23:00.466 "driver_specific": { 00:23:00.466 "passthru": { 00:23:00.466 "name": "pt3", 00:23:00.466 "base_bdev_name": "malloc3" 00:23:00.466 } 00:23:00.466 } 00:23:00.466 }' 00:23:00.466 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.466 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.466 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:00.466 07:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:00.724 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:00.982 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:00.982 "name": "pt4", 00:23:00.982 "aliases": [ 00:23:00.982 "00000000-0000-0000-0000-000000000004" 00:23:00.982 ], 00:23:00.982 "product_name": "passthru", 00:23:00.982 "block_size": 512, 00:23:00.982 "num_blocks": 65536, 00:23:00.982 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:00.982 "assigned_rate_limits": { 00:23:00.982 "rw_ios_per_sec": 0, 00:23:00.982 "rw_mbytes_per_sec": 0, 00:23:00.982 "r_mbytes_per_sec": 0, 00:23:00.982 "w_mbytes_per_sec": 0 00:23:00.982 }, 00:23:00.982 "claimed": true, 00:23:00.982 "claim_type": "exclusive_write", 00:23:00.982 "zoned": false, 00:23:00.982 "supported_io_types": { 00:23:00.982 "read": true, 00:23:00.982 "write": true, 00:23:00.982 "unmap": true, 00:23:00.982 "flush": true, 00:23:00.982 "reset": true, 00:23:00.982 "nvme_admin": false, 00:23:00.982 "nvme_io": false, 00:23:00.982 "nvme_io_md": false, 00:23:00.982 "write_zeroes": true, 00:23:00.982 "zcopy": true, 00:23:00.982 "get_zone_info": false, 00:23:00.982 "zone_management": false, 00:23:00.982 "zone_append": false, 00:23:00.982 "compare": false, 
00:23:00.982 "compare_and_write": false, 00:23:00.982 "abort": true, 00:23:00.982 "seek_hole": false, 00:23:00.982 "seek_data": false, 00:23:00.982 "copy": true, 00:23:00.982 "nvme_iov_md": false 00:23:00.982 }, 00:23:00.982 "memory_domains": [ 00:23:00.982 { 00:23:00.982 "dma_device_id": "system", 00:23:00.982 "dma_device_type": 1 00:23:00.982 }, 00:23:00.982 { 00:23:00.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.982 "dma_device_type": 2 00:23:00.982 } 00:23:00.982 ], 00:23:00.982 "driver_specific": { 00:23:00.982 "passthru": { 00:23:00.982 "name": "pt4", 00:23:00.982 "base_bdev_name": "malloc4" 00:23:00.982 } 00:23:00.982 } 00:23:00.982 }' 00:23:00.982 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.240 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.498 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:01.498 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:01.498 07:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:23:01.498 [2024-07-25 07:28:34.023496] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:01.756 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ddcecdd8-f9bf-44b8-8cc5-946205984c83 00:23:01.756 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ddcecdd8-f9bf-44b8-8cc5-946205984c83 ']' 00:23:01.756 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:01.756 [2024-07-25 07:28:34.247799] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:01.756 [2024-07-25 07:28:34.247818] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:01.756 [2024-07-25 07:28:34.247865] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:01.756 [2024-07-25 07:28:34.247939] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:01.756 [2024-07-25 07:28:34.247950] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2312720 name raid_bdev1, state offline 00:23:01.756 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.756 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:23:02.014 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:23:02.014 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:23:02.014 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:02.014 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:02.273 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:02.273 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:02.531 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:02.531 07:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:02.789 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:02.789 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:03.058 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:03.058 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:03.316 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:03.316 [2024-07-25 07:28:35.827867] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:03.316 [2024-07-25 07:28:35.829105] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:03.316 [2024-07-25 07:28:35.829154] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:23:03.317 [2024-07-25 07:28:35.829186] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:23:03.317 [2024-07-25 07:28:35.829226] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:03.317 [2024-07-25 07:28:35.829263] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:03.317 [2024-07-25 07:28:35.829284] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:23:03.317 [2024-07-25 07:28:35.829304] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:23:03.317 [2024-07-25 07:28:35.829321] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:03.317 [2024-07-25 07:28:35.829330] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2313f00 name raid_bdev1, state configuring 00:23:03.317 request: 00:23:03.317 { 00:23:03.317 "name": "raid_bdev1", 00:23:03.317 "raid_level": "raid1", 00:23:03.317 "base_bdevs": [ 00:23:03.317 "malloc1", 00:23:03.317 "malloc2", 00:23:03.317 "malloc3", 00:23:03.317 "malloc4" 00:23:03.317 ], 00:23:03.317 "superblock": false, 00:23:03.317 "method": "bdev_raid_create", 00:23:03.317 "req_id": 1 00:23:03.317 } 00:23:03.317 Got JSON-RPC error response 00:23:03.317 response: 00:23:03.317 { 00:23:03.317 "code": -17, 00:23:03.317 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:03.317 } 00:23:03.317 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:23:03.317 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:23:03.317 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:23:03.317 07:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:23:03.575 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.575 07:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:23:03.575 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:23:03.575 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:23:03.575 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:03.833 [2024-07-25 07:28:36.289023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:03.833 [2024-07-25 07:28:36.289061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.833 [2024-07-25 07:28:36.289080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230d490 00:23:03.833 [2024-07-25 07:28:36.289091] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.833 [2024-07-25 07:28:36.290558] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.833 [2024-07-25 07:28:36.290586] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:03.833 [2024-07-25 07:28:36.290643] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:03.833 [2024-07-25 07:28:36.290667] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:03.833 pt1 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.833 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.092 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.092 "name": "raid_bdev1", 00:23:04.092 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:04.092 "strip_size_kb": 0, 00:23:04.092 "state": "configuring", 00:23:04.092 "raid_level": "raid1", 00:23:04.092 "superblock": true, 00:23:04.092 "num_base_bdevs": 4, 00:23:04.092 "num_base_bdevs_discovered": 1, 00:23:04.092 "num_base_bdevs_operational": 4, 00:23:04.092 "base_bdevs_list": [ 00:23:04.092 { 00:23:04.092 "name": "pt1", 00:23:04.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:04.092 "is_configured": true, 00:23:04.092 "data_offset": 2048, 00:23:04.092 "data_size": 63488 00:23:04.092 }, 00:23:04.092 { 00:23:04.092 "name": null, 00:23:04.092 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:04.092 "is_configured": false, 00:23:04.092 "data_offset": 2048, 00:23:04.092 "data_size": 63488 00:23:04.092 }, 00:23:04.092 { 00:23:04.092 
"name": null, 00:23:04.092 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:04.092 "is_configured": false, 00:23:04.092 "data_offset": 2048, 00:23:04.092 "data_size": 63488 00:23:04.092 }, 00:23:04.092 { 00:23:04.092 "name": null, 00:23:04.092 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:04.092 "is_configured": false, 00:23:04.092 "data_offset": 2048, 00:23:04.092 "data_size": 63488 00:23:04.092 } 00:23:04.092 ] 00:23:04.092 }' 00:23:04.092 07:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.092 07:28:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:04.659 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:23:04.659 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:04.918 [2024-07-25 07:28:37.299742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:04.918 [2024-07-25 07:28:37.299786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.918 [2024-07-25 07:28:37.299802] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230e2b0 00:23:04.918 [2024-07-25 07:28:37.299813] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.918 [2024-07-25 07:28:37.300119] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.918 [2024-07-25 07:28:37.300135] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:04.918 [2024-07-25 07:28:37.300199] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:04.918 [2024-07-25 07:28:37.300218] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:04.918 pt2 00:23:04.918 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:05.176 [2024-07-25 07:28:37.516318] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.176 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.434 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.434 "name": "raid_bdev1", 00:23:05.434 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:05.434 "strip_size_kb": 0, 00:23:05.434 "state": "configuring", 00:23:05.434 "raid_level": "raid1", 00:23:05.434 "superblock": true, 00:23:05.434 "num_base_bdevs": 4, 00:23:05.434 "num_base_bdevs_discovered": 1, 00:23:05.434 "num_base_bdevs_operational": 4, 00:23:05.434 "base_bdevs_list": [ 00:23:05.434 { 00:23:05.434 "name": "pt1", 00:23:05.434 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:05.434 "is_configured": true, 00:23:05.434 "data_offset": 2048, 00:23:05.434 "data_size": 63488 00:23:05.434 }, 00:23:05.434 { 00:23:05.434 "name": null, 00:23:05.434 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:05.434 "is_configured": false, 00:23:05.434 "data_offset": 2048, 00:23:05.434 "data_size": 63488 00:23:05.434 }, 00:23:05.434 { 00:23:05.434 "name": null, 00:23:05.434 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:05.434 "is_configured": false, 00:23:05.434 "data_offset": 2048, 00:23:05.434 "data_size": 63488 00:23:05.434 }, 00:23:05.434 { 00:23:05.434 "name": null, 00:23:05.434 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:05.434 "is_configured": false, 00:23:05.434 "data_offset": 2048, 00:23:05.434 "data_size": 63488 00:23:05.434 } 00:23:05.434 ] 00:23:05.434 }' 00:23:05.434 07:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.434 07:28:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:06.000 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:23:06.000 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:06.000 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:06.000 [2024-07-25 07:28:38.526994] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:06.000 [2024-07-25 07:28:38.527039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.000 [2024-07-25 07:28:38.527056] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230eea0 00:23:06.000 [2024-07-25 07:28:38.527068] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.000 [2024-07-25 07:28:38.527378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.000 [2024-07-25 07:28:38.527394] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:06.000 [2024-07-25 07:28:38.527449] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:06.000 [2024-07-25 07:28:38.527466] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:06.000 pt2 00:23:06.258 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:23:06.258 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:06.258 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:06.258 [2024-07-25 07:28:38.695438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:06.258 [2024-07-25 07:28:38.695468] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.258 [2024-07-25 07:28:38.695484] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2313ca0 00:23:06.258 [2024-07-25 07:28:38.695495] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.258 [2024-07-25 07:28:38.695746] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.258 [2024-07-25 07:28:38.695761] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:06.258 [2024-07-25 07:28:38.695805] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:06.258 [2024-07-25 07:28:38.695821] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:06.258 pt3 00:23:06.258 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:23:06.258 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:06.259 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:06.517 [2024-07-25 07:28:38.859868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:06.517 [2024-07-25 07:28:38.859894] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.517 [2024-07-25 07:28:38.859909] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2162bc0 00:23:06.517 [2024-07-25 07:28:38.859919] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.517 [2024-07-25 07:28:38.860166] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.517 [2024-07-25 07:28:38.860181] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:06.517 [2024-07-25 07:28:38.860223] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:06.517 [2024-07-25 07:28:38.860238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:06.517 [2024-07-25 07:28:38.860343] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x230e740 00:23:06.517 [2024-07-25 07:28:38.860353] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:06.517 [2024-07-25 07:28:38.860502] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2311cf0 00:23:06.517 [2024-07-25 07:28:38.860624] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x230e740 00:23:06.517 [2024-07-25 07:28:38.860633] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x230e740 00:23:06.517 [2024-07-25 07:28:38.860721] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.517 pt4 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:06.517 
07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.517 07:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.775 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.775 "name": "raid_bdev1", 00:23:06.775 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:06.775 "strip_size_kb": 0, 00:23:06.775 "state": "online", 00:23:06.775 "raid_level": "raid1", 00:23:06.775 "superblock": true, 00:23:06.775 "num_base_bdevs": 4, 00:23:06.775 "num_base_bdevs_discovered": 4, 00:23:06.775 "num_base_bdevs_operational": 4, 00:23:06.775 "base_bdevs_list": [ 00:23:06.775 { 00:23:06.775 "name": "pt1", 00:23:06.775 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:06.775 "is_configured": true, 00:23:06.775 "data_offset": 2048, 00:23:06.775 "data_size": 63488 00:23:06.775 }, 00:23:06.775 { 00:23:06.775 "name": "pt2", 00:23:06.775 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:06.775 "is_configured": true, 00:23:06.775 "data_offset": 2048, 00:23:06.775 "data_size": 63488 00:23:06.775 }, 00:23:06.775 { 00:23:06.775 "name": "pt3", 00:23:06.775 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:06.775 "is_configured": true, 00:23:06.775 "data_offset": 2048, 00:23:06.775 "data_size": 63488 00:23:06.775 }, 00:23:06.775 { 00:23:06.775 "name": "pt4", 00:23:06.775 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:06.775 "is_configured": true, 00:23:06.775 "data_offset": 2048, 00:23:06.775 "data_size": 63488 00:23:06.775 } 00:23:06.775 ] 00:23:06.775 }' 00:23:06.775 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.775 07:28:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:07.341 [2024-07-25 07:28:39.830718] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:07.341 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:07.341 "name": "raid_bdev1", 00:23:07.341 "aliases": [ 00:23:07.341 "ddcecdd8-f9bf-44b8-8cc5-946205984c83" 00:23:07.341 ], 00:23:07.341 "product_name": "Raid Volume", 00:23:07.341 "block_size": 512, 00:23:07.341 "num_blocks": 63488, 00:23:07.341 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:07.341 "assigned_rate_limits": { 00:23:07.341 "rw_ios_per_sec": 0, 00:23:07.341 "rw_mbytes_per_sec": 0, 00:23:07.341 "r_mbytes_per_sec": 0, 00:23:07.341 "w_mbytes_per_sec": 0 00:23:07.341 }, 00:23:07.341 "claimed": false, 00:23:07.341 "zoned": false, 00:23:07.341 "supported_io_types": { 00:23:07.341 "read": true, 00:23:07.341 "write": true, 00:23:07.341 "unmap": false, 00:23:07.341 "flush": false, 00:23:07.341 "reset": true, 00:23:07.341 "nvme_admin": false, 00:23:07.341 "nvme_io": false, 00:23:07.341 "nvme_io_md": false, 00:23:07.341 "write_zeroes": true, 00:23:07.341 "zcopy": false, 00:23:07.341 "get_zone_info": false, 00:23:07.341 "zone_management": false, 00:23:07.341 "zone_append": false, 00:23:07.341 "compare": false, 00:23:07.341 "compare_and_write": false, 00:23:07.341 "abort": false, 00:23:07.341 "seek_hole": false, 00:23:07.341 "seek_data": false, 00:23:07.341 "copy": false, 00:23:07.341 "nvme_iov_md": false 00:23:07.341 }, 00:23:07.341 "memory_domains": [ 00:23:07.341 { 00:23:07.341 "dma_device_id": "system", 00:23:07.341 "dma_device_type": 1 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.342 "dma_device_type": 2 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "dma_device_id": "system", 00:23:07.342 "dma_device_type": 1 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.342 "dma_device_type": 2 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "dma_device_id": "system", 00:23:07.342 "dma_device_type": 1 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.342 "dma_device_type": 2 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "dma_device_id": "system", 00:23:07.342 "dma_device_type": 1 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.342 "dma_device_type": 2 00:23:07.342 } 00:23:07.342 ], 00:23:07.342 "driver_specific": { 00:23:07.342 "raid": { 00:23:07.342 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:07.342 "strip_size_kb": 0, 00:23:07.342 "state": "online", 00:23:07.342 "raid_level": "raid1", 00:23:07.342 "superblock": true, 00:23:07.342 "num_base_bdevs": 4, 00:23:07.342 "num_base_bdevs_discovered": 4, 00:23:07.342 "num_base_bdevs_operational": 4, 00:23:07.342 "base_bdevs_list": [ 00:23:07.342 { 00:23:07.342 "name": "pt1", 00:23:07.342 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:07.342 "is_configured": true, 00:23:07.342 "data_offset": 2048, 00:23:07.342 "data_size": 63488 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "name": "pt2", 00:23:07.342 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:23:07.342 "is_configured": true, 00:23:07.342 "data_offset": 2048, 00:23:07.342 "data_size": 63488 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "name": "pt3", 00:23:07.342 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:07.342 "is_configured": true, 00:23:07.342 "data_offset": 2048, 00:23:07.342 "data_size": 63488 00:23:07.342 }, 00:23:07.342 { 00:23:07.342 "name": "pt4", 00:23:07.342 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:07.342 "is_configured": true, 00:23:07.342 "data_offset": 2048, 00:23:07.342 "data_size": 63488 00:23:07.342 } 00:23:07.342 ] 00:23:07.342 } 00:23:07.342 } 00:23:07.342 }' 00:23:07.342 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:07.600 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:07.600 pt2 00:23:07.600 pt3 00:23:07.600 pt4' 00:23:07.600 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:07.600 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:07.600 07:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:07.600 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:07.600 "name": "pt1", 00:23:07.600 "aliases": [ 00:23:07.600 "00000000-0000-0000-0000-000000000001" 00:23:07.600 ], 00:23:07.601 "product_name": "passthru", 00:23:07.601 "block_size": 512, 00:23:07.601 "num_blocks": 65536, 00:23:07.601 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:07.601 "assigned_rate_limits": { 00:23:07.601 "rw_ios_per_sec": 0, 00:23:07.601 "rw_mbytes_per_sec": 0, 00:23:07.601 "r_mbytes_per_sec": 0, 00:23:07.601 "w_mbytes_per_sec": 0 00:23:07.601 }, 00:23:07.601 "claimed": true, 00:23:07.601 "claim_type": "exclusive_write", 00:23:07.601 "zoned": false, 00:23:07.601 "supported_io_types": { 00:23:07.601 "read": true, 00:23:07.601 "write": true, 00:23:07.601 "unmap": true, 00:23:07.601 "flush": true, 00:23:07.601 "reset": true, 00:23:07.601 "nvme_admin": false, 00:23:07.601 "nvme_io": false, 00:23:07.601 "nvme_io_md": false, 00:23:07.601 "write_zeroes": true, 00:23:07.601 "zcopy": true, 00:23:07.601 "get_zone_info": false, 00:23:07.601 "zone_management": false, 00:23:07.601 "zone_append": false, 00:23:07.601 "compare": false, 00:23:07.601 "compare_and_write": false, 00:23:07.601 "abort": true, 00:23:07.601 "seek_hole": false, 00:23:07.601 "seek_data": false, 00:23:07.601 "copy": true, 00:23:07.601 "nvme_iov_md": false 00:23:07.601 }, 00:23:07.601 "memory_domains": [ 00:23:07.601 { 00:23:07.601 "dma_device_id": "system", 00:23:07.601 "dma_device_type": 1 00:23:07.601 }, 00:23:07.601 { 00:23:07.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.601 "dma_device_type": 2 00:23:07.601 } 00:23:07.601 ], 00:23:07.601 "driver_specific": { 00:23:07.601 "passthru": { 00:23:07.601 "name": "pt1", 00:23:07.601 "base_bdev_name": "malloc1" 00:23:07.601 } 00:23:07.601 } 00:23:07.601 }' 00:23:07.601 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:07.859 07:28:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:07.859 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:08.117 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:08.117 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:08.117 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:08.117 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:08.117 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:08.375 "name": "pt2", 00:23:08.375 "aliases": [ 00:23:08.375 "00000000-0000-0000-0000-000000000002" 00:23:08.375 ], 00:23:08.375 "product_name": "passthru", 00:23:08.375 "block_size": 512, 00:23:08.375 "num_blocks": 65536, 00:23:08.375 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:08.375 "assigned_rate_limits": { 00:23:08.375 "rw_ios_per_sec": 0, 00:23:08.375 "rw_mbytes_per_sec": 0, 00:23:08.375 "r_mbytes_per_sec": 0, 00:23:08.375 "w_mbytes_per_sec": 0 00:23:08.375 }, 00:23:08.375 "claimed": true, 00:23:08.375 "claim_type": "exclusive_write", 00:23:08.375 "zoned": false, 00:23:08.375 "supported_io_types": { 00:23:08.375 "read": true, 00:23:08.375 "write": true, 00:23:08.375 "unmap": true, 00:23:08.375 "flush": true, 00:23:08.375 "reset": true, 00:23:08.375 "nvme_admin": false, 00:23:08.375 "nvme_io": false, 00:23:08.375 "nvme_io_md": false, 00:23:08.375 "write_zeroes": true, 00:23:08.375 "zcopy": true, 00:23:08.375 "get_zone_info": false, 00:23:08.375 "zone_management": false, 00:23:08.375 "zone_append": false, 00:23:08.375 "compare": false, 00:23:08.375 "compare_and_write": false, 00:23:08.375 "abort": true, 00:23:08.375 "seek_hole": false, 00:23:08.375 "seek_data": false, 00:23:08.375 "copy": true, 00:23:08.375 "nvme_iov_md": false 00:23:08.375 }, 00:23:08.375 "memory_domains": [ 00:23:08.375 { 00:23:08.375 "dma_device_id": "system", 00:23:08.375 "dma_device_type": 1 00:23:08.375 }, 00:23:08.375 { 00:23:08.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:08.375 "dma_device_type": 2 00:23:08.375 } 00:23:08.375 ], 00:23:08.375 "driver_specific": { 00:23:08.375 "passthru": { 00:23:08.375 "name": "pt2", 00:23:08.375 "base_bdev_name": "malloc2" 00:23:08.375 } 00:23:08.375 } 00:23:08.375 }' 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:08.375 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:08.642 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:08.642 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:08.642 07:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:08.642 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:08.642 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:08.642 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:08.642 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:08.927 "name": "pt3", 00:23:08.927 "aliases": [ 00:23:08.927 "00000000-0000-0000-0000-000000000003" 00:23:08.927 ], 00:23:08.927 "product_name": "passthru", 00:23:08.927 "block_size": 512, 00:23:08.927 "num_blocks": 65536, 00:23:08.927 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:08.927 "assigned_rate_limits": { 00:23:08.927 "rw_ios_per_sec": 0, 00:23:08.927 "rw_mbytes_per_sec": 0, 00:23:08.927 "r_mbytes_per_sec": 0, 00:23:08.927 "w_mbytes_per_sec": 0 00:23:08.927 }, 00:23:08.927 "claimed": true, 00:23:08.927 "claim_type": "exclusive_write", 00:23:08.927 "zoned": false, 00:23:08.927 "supported_io_types": { 00:23:08.927 "read": true, 00:23:08.927 "write": true, 00:23:08.927 "unmap": true, 00:23:08.927 "flush": true, 00:23:08.927 "reset": true, 00:23:08.927 "nvme_admin": false, 00:23:08.927 "nvme_io": false, 00:23:08.927 "nvme_io_md": false, 00:23:08.927 "write_zeroes": true, 00:23:08.927 "zcopy": true, 00:23:08.927 "get_zone_info": false, 00:23:08.927 "zone_management": false, 00:23:08.927 "zone_append": false, 00:23:08.927 "compare": false, 00:23:08.927 "compare_and_write": false, 00:23:08.927 "abort": true, 00:23:08.927 "seek_hole": false, 00:23:08.927 "seek_data": false, 00:23:08.927 "copy": true, 00:23:08.927 "nvme_iov_md": false 00:23:08.927 }, 00:23:08.927 "memory_domains": [ 00:23:08.927 { 00:23:08.927 "dma_device_id": "system", 00:23:08.927 "dma_device_type": 1 00:23:08.927 }, 00:23:08.927 { 00:23:08.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:08.927 "dma_device_type": 2 00:23:08.927 } 00:23:08.927 ], 00:23:08.927 "driver_specific": { 00:23:08.927 "passthru": { 00:23:08.927 "name": "pt3", 00:23:08.927 "base_bdev_name": "malloc3" 00:23:08.927 } 00:23:08.927 } 00:23:08.927 }' 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:23:08.927 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:09.200 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:09.458 "name": "pt4", 00:23:09.458 "aliases": [ 00:23:09.458 "00000000-0000-0000-0000-000000000004" 00:23:09.458 ], 00:23:09.458 "product_name": "passthru", 00:23:09.458 "block_size": 512, 00:23:09.458 "num_blocks": 65536, 00:23:09.458 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:09.458 "assigned_rate_limits": { 00:23:09.458 "rw_ios_per_sec": 0, 00:23:09.458 "rw_mbytes_per_sec": 0, 00:23:09.458 "r_mbytes_per_sec": 0, 00:23:09.458 "w_mbytes_per_sec": 0 00:23:09.458 }, 00:23:09.458 "claimed": true, 00:23:09.458 "claim_type": "exclusive_write", 00:23:09.458 "zoned": false, 00:23:09.458 "supported_io_types": { 00:23:09.458 "read": true, 00:23:09.458 "write": true, 00:23:09.458 "unmap": true, 00:23:09.458 "flush": true, 00:23:09.458 "reset": true, 00:23:09.458 "nvme_admin": false, 00:23:09.458 "nvme_io": false, 00:23:09.458 "nvme_io_md": false, 00:23:09.458 "write_zeroes": true, 00:23:09.458 "zcopy": true, 00:23:09.458 "get_zone_info": false, 00:23:09.458 "zone_management": false, 00:23:09.458 "zone_append": false, 00:23:09.458 "compare": false, 00:23:09.458 "compare_and_write": false, 00:23:09.458 "abort": true, 00:23:09.458 "seek_hole": false, 00:23:09.458 "seek_data": false, 00:23:09.458 "copy": true, 00:23:09.458 "nvme_iov_md": false 00:23:09.458 }, 00:23:09.458 "memory_domains": [ 00:23:09.458 { 00:23:09.458 "dma_device_id": "system", 00:23:09.458 "dma_device_type": 1 00:23:09.458 }, 00:23:09.458 { 00:23:09.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:09.458 "dma_device_type": 2 00:23:09.458 } 00:23:09.458 ], 00:23:09.458 "driver_specific": { 00:23:09.458 "passthru": { 00:23:09.458 "name": "pt4", 00:23:09.458 "base_bdev_name": "malloc4" 00:23:09.458 } 00:23:09.458 } 00:23:09.458 }' 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:09.458 07:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:09.716 07:28:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:09.716 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:09.716 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:09.716 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:09.716 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:09.716 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:09.716 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:23:09.974 [2024-07-25 07:28:42.337503] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:09.974 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ddcecdd8-f9bf-44b8-8cc5-946205984c83 '!=' ddcecdd8-f9bf-44b8-8cc5-946205984c83 ']' 00:23:09.974 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:23:09.974 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:09.974 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:09.974 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:10.232 [2024-07-25 07:28:42.565846] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.232 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.491 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:10.491 "name": "raid_bdev1", 00:23:10.491 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:10.491 "strip_size_kb": 0, 00:23:10.491 "state": "online", 00:23:10.491 "raid_level": "raid1", 00:23:10.491 "superblock": true, 00:23:10.491 "num_base_bdevs": 4, 00:23:10.491 "num_base_bdevs_discovered": 3, 00:23:10.491 "num_base_bdevs_operational": 3, 00:23:10.491 
"base_bdevs_list": [ 00:23:10.491 { 00:23:10.491 "name": null, 00:23:10.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.491 "is_configured": false, 00:23:10.491 "data_offset": 2048, 00:23:10.491 "data_size": 63488 00:23:10.491 }, 00:23:10.491 { 00:23:10.491 "name": "pt2", 00:23:10.491 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:10.491 "is_configured": true, 00:23:10.491 "data_offset": 2048, 00:23:10.491 "data_size": 63488 00:23:10.491 }, 00:23:10.491 { 00:23:10.491 "name": "pt3", 00:23:10.491 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:10.491 "is_configured": true, 00:23:10.491 "data_offset": 2048, 00:23:10.491 "data_size": 63488 00:23:10.491 }, 00:23:10.491 { 00:23:10.491 "name": "pt4", 00:23:10.491 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:10.491 "is_configured": true, 00:23:10.491 "data_offset": 2048, 00:23:10.491 "data_size": 63488 00:23:10.491 } 00:23:10.491 ] 00:23:10.491 }' 00:23:10.491 07:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:10.491 07:28:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:11.053 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:11.053 [2024-07-25 07:28:43.564455] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:11.054 [2024-07-25 07:28:43.564480] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:11.054 [2024-07-25 07:28:43.564528] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:11.054 [2024-07-25 07:28:43.564592] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:11.054 [2024-07-25 07:28:43.564603] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x230e740 name raid_bdev1, state offline 00:23:11.054 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.054 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:23:11.310 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:23:11.310 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:23:11.310 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:23:11.310 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:11.310 07:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:11.568 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:23:11.568 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:11.568 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:11.826 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:23:11.826 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:11.826 07:28:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:12.084 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:23:12.084 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:12.084 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:23:12.084 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:23:12.084 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:12.342 [2024-07-25 07:28:44.699395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:12.342 [2024-07-25 07:28:44.699436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:12.342 [2024-07-25 07:28:44.699451] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2315780 00:23:12.342 [2024-07-25 07:28:44.699462] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:12.342 [2024-07-25 07:28:44.700933] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:12.342 [2024-07-25 07:28:44.700963] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:12.342 [2024-07-25 07:28:44.701019] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:12.342 [2024-07-25 07:28:44.701044] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:12.342 pt2 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.342 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.601 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.601 "name": "raid_bdev1", 00:23:12.601 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:12.601 "strip_size_kb": 0, 00:23:12.601 "state": "configuring", 00:23:12.601 "raid_level": "raid1", 00:23:12.601 "superblock": true, 
00:23:12.601 "num_base_bdevs": 4, 00:23:12.601 "num_base_bdevs_discovered": 1, 00:23:12.601 "num_base_bdevs_operational": 3, 00:23:12.601 "base_bdevs_list": [ 00:23:12.601 { 00:23:12.601 "name": null, 00:23:12.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.601 "is_configured": false, 00:23:12.601 "data_offset": 2048, 00:23:12.601 "data_size": 63488 00:23:12.601 }, 00:23:12.601 { 00:23:12.601 "name": "pt2", 00:23:12.601 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:12.601 "is_configured": true, 00:23:12.601 "data_offset": 2048, 00:23:12.601 "data_size": 63488 00:23:12.601 }, 00:23:12.601 { 00:23:12.601 "name": null, 00:23:12.601 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:12.601 "is_configured": false, 00:23:12.601 "data_offset": 2048, 00:23:12.601 "data_size": 63488 00:23:12.601 }, 00:23:12.601 { 00:23:12.601 "name": null, 00:23:12.601 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:12.601 "is_configured": false, 00:23:12.601 "data_offset": 2048, 00:23:12.601 "data_size": 63488 00:23:12.601 } 00:23:12.601 ] 00:23:12.601 }' 00:23:12.601 07:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.601 07:28:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:13.167 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:23:13.167 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:23:13.167 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:13.425 [2024-07-25 07:28:45.726110] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:13.425 [2024-07-25 07:28:45.726160] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.425 [2024-07-25 07:28:45.726178] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2315520 00:23:13.425 [2024-07-25 07:28:45.726189] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:13.425 [2024-07-25 07:28:45.726493] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.425 [2024-07-25 07:28:45.726509] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:13.425 [2024-07-25 07:28:45.726560] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:13.425 [2024-07-25 07:28:45.726577] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:13.425 pt3 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.425 07:28:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.425 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.684 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.684 "name": "raid_bdev1", 00:23:13.684 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:13.684 "strip_size_kb": 0, 00:23:13.684 "state": "configuring", 00:23:13.684 "raid_level": "raid1", 00:23:13.684 "superblock": true, 00:23:13.684 "num_base_bdevs": 4, 00:23:13.684 "num_base_bdevs_discovered": 2, 00:23:13.684 "num_base_bdevs_operational": 3, 00:23:13.684 "base_bdevs_list": [ 00:23:13.684 { 00:23:13.684 "name": null, 00:23:13.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.684 "is_configured": false, 00:23:13.684 "data_offset": 2048, 00:23:13.684 "data_size": 63488 00:23:13.684 }, 00:23:13.684 { 00:23:13.684 "name": "pt2", 00:23:13.684 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:13.684 "is_configured": true, 00:23:13.684 "data_offset": 2048, 00:23:13.684 "data_size": 63488 00:23:13.684 }, 00:23:13.684 { 00:23:13.684 "name": "pt3", 00:23:13.684 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:13.684 "is_configured": true, 00:23:13.684 "data_offset": 2048, 00:23:13.684 "data_size": 63488 00:23:13.684 }, 00:23:13.684 { 00:23:13.684 "name": null, 00:23:13.684 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:13.684 "is_configured": false, 00:23:13.684 "data_offset": 2048, 00:23:13.684 "data_size": 63488 00:23:13.684 } 00:23:13.684 ] 00:23:13.684 }' 00:23:13.684 07:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.684 07:28:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=3 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:14.251 [2024-07-25 07:28:46.724906] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:14.251 [2024-07-25 07:28:46.724952] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:14.251 [2024-07-25 07:28:46.724968] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2295fa0 00:23:14.251 [2024-07-25 07:28:46.724979] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:14.251 [2024-07-25 07:28:46.725274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:14.251 [2024-07-25 07:28:46.725289] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:14.251 [2024-07-25 07:28:46.725341] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:14.251 [2024-07-25 07:28:46.725358] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:14.251 [2024-07-25 07:28:46.725458] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2164650 00:23:14.251 [2024-07-25 07:28:46.725468] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:14.251 [2024-07-25 07:28:46.725620] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2315a10 00:23:14.251 [2024-07-25 07:28:46.725740] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2164650 00:23:14.251 [2024-07-25 07:28:46.725749] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2164650 00:23:14.251 [2024-07-25 07:28:46.725836] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:14.251 pt4 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.251 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.510 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:14.510 "name": "raid_bdev1", 00:23:14.510 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:14.510 "strip_size_kb": 0, 00:23:14.510 "state": "online", 00:23:14.510 "raid_level": "raid1", 00:23:14.510 "superblock": true, 00:23:14.510 "num_base_bdevs": 4, 00:23:14.510 "num_base_bdevs_discovered": 3, 00:23:14.510 "num_base_bdevs_operational": 3, 00:23:14.510 "base_bdevs_list": [ 00:23:14.510 { 00:23:14.510 "name": null, 00:23:14.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.510 "is_configured": false, 00:23:14.510 "data_offset": 2048, 00:23:14.510 "data_size": 63488 00:23:14.510 }, 00:23:14.510 { 00:23:14.510 "name": "pt2", 00:23:14.510 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:14.510 "is_configured": true, 00:23:14.510 "data_offset": 2048, 00:23:14.510 "data_size": 63488 00:23:14.510 }, 00:23:14.510 { 00:23:14.510 "name": "pt3", 00:23:14.510 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:14.510 "is_configured": true, 00:23:14.510 "data_offset": 2048, 00:23:14.510 "data_size": 63488 00:23:14.510 
}, 00:23:14.510 { 00:23:14.510 "name": "pt4", 00:23:14.510 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:14.510 "is_configured": true, 00:23:14.510 "data_offset": 2048, 00:23:14.510 "data_size": 63488 00:23:14.510 } 00:23:14.510 ] 00:23:14.510 }' 00:23:14.510 07:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:14.510 07:28:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:15.076 07:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:15.334 [2024-07-25 07:28:47.747594] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:15.334 [2024-07-25 07:28:47.747616] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:15.334 [2024-07-25 07:28:47.747665] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:15.334 [2024-07-25 07:28:47.747728] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:15.334 [2024-07-25 07:28:47.747739] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2164650 name raid_bdev1, state offline 00:23:15.334 07:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:23:15.334 07:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.592 07:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:23:15.592 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:23:15.592 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 4 -gt 2 ']' 00:23:15.592 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=3 00:23:15.592 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:15.850 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:16.109 [2024-07-25 07:28:48.437366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:16.109 [2024-07-25 07:28:48.437401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.109 [2024-07-25 07:28:48.437416] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2295fa0 00:23:16.109 [2024-07-25 07:28:48.437427] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.109 [2024-07-25 07:28:48.438903] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.109 [2024-07-25 07:28:48.438929] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:16.109 [2024-07-25 07:28:48.438983] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:16.109 [2024-07-25 07:28:48.439008] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:16.109 [2024-07-25 07:28:48.439095] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than 
existing raid bdev raid_bdev1 (2) 00:23:16.109 [2024-07-25 07:28:48.439107] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:16.109 [2024-07-25 07:28:48.439120] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21631c0 name raid_bdev1, state configuring 00:23:16.109 [2024-07-25 07:28:48.439151] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:16.109 [2024-07-25 07:28:48.439223] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:16.109 pt1 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 4 -gt 2 ']' 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.109 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.368 07:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.368 "name": "raid_bdev1", 00:23:16.368 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:16.368 "strip_size_kb": 0, 00:23:16.368 "state": "configuring", 00:23:16.368 "raid_level": "raid1", 00:23:16.368 "superblock": true, 00:23:16.368 "num_base_bdevs": 4, 00:23:16.368 "num_base_bdevs_discovered": 2, 00:23:16.368 "num_base_bdevs_operational": 3, 00:23:16.368 "base_bdevs_list": [ 00:23:16.368 { 00:23:16.368 "name": null, 00:23:16.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.368 "is_configured": false, 00:23:16.368 "data_offset": 2048, 00:23:16.368 "data_size": 63488 00:23:16.368 }, 00:23:16.368 { 00:23:16.368 "name": "pt2", 00:23:16.368 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:16.368 "is_configured": true, 00:23:16.368 "data_offset": 2048, 00:23:16.368 "data_size": 63488 00:23:16.368 }, 00:23:16.368 { 00:23:16.368 "name": "pt3", 00:23:16.368 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:16.368 "is_configured": true, 00:23:16.368 "data_offset": 2048, 00:23:16.368 "data_size": 63488 00:23:16.368 }, 00:23:16.368 { 00:23:16.368 "name": null, 00:23:16.368 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:16.368 "is_configured": false, 00:23:16.368 "data_offset": 2048, 00:23:16.368 "data_size": 63488 00:23:16.368 } 00:23:16.368 ] 00:23:16.368 }' 00:23:16.368 07:28:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.368 07:28:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:16.934 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:23:16.934 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:17.192 [2024-07-25 07:28:49.688676] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:17.192 [2024-07-25 07:28:49.688723] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:17.192 [2024-07-25 07:28:49.688741] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21637d0 00:23:17.192 [2024-07-25 07:28:49.688753] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:17.192 [2024-07-25 07:28:49.689063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:17.192 [2024-07-25 07:28:49.689078] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:17.192 [2024-07-25 07:28:49.689131] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:17.192 [2024-07-25 07:28:49.689159] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:17.192 [2024-07-25 07:28:49.689264] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2163020 00:23:17.192 [2024-07-25 07:28:49.689274] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:17.192 [2024-07-25 07:28:49.689427] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2315a10 00:23:17.192 [2024-07-25 07:28:49.689546] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2163020 00:23:17.192 [2024-07-25 07:28:49.689555] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2163020 00:23:17.192 [2024-07-25 07:28:49.689641] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.192 pt4 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.192 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.450 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.450 "name": "raid_bdev1", 00:23:17.450 "uuid": "ddcecdd8-f9bf-44b8-8cc5-946205984c83", 00:23:17.450 "strip_size_kb": 0, 00:23:17.450 "state": "online", 00:23:17.450 "raid_level": "raid1", 00:23:17.450 "superblock": true, 00:23:17.450 "num_base_bdevs": 4, 00:23:17.450 "num_base_bdevs_discovered": 3, 00:23:17.450 "num_base_bdevs_operational": 3, 00:23:17.450 "base_bdevs_list": [ 00:23:17.450 { 00:23:17.450 "name": null, 00:23:17.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.450 "is_configured": false, 00:23:17.450 "data_offset": 2048, 00:23:17.450 "data_size": 63488 00:23:17.450 }, 00:23:17.450 { 00:23:17.450 "name": "pt2", 00:23:17.450 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:17.450 "is_configured": true, 00:23:17.450 "data_offset": 2048, 00:23:17.450 "data_size": 63488 00:23:17.450 }, 00:23:17.450 { 00:23:17.450 "name": "pt3", 00:23:17.450 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:17.450 "is_configured": true, 00:23:17.450 "data_offset": 2048, 00:23:17.450 "data_size": 63488 00:23:17.450 }, 00:23:17.450 { 00:23:17.450 "name": "pt4", 00:23:17.450 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:17.450 "is_configured": true, 00:23:17.450 "data_offset": 2048, 00:23:17.450 "data_size": 63488 00:23:17.450 } 00:23:17.450 ] 00:23:17.450 }' 00:23:17.450 07:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.450 07:28:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:18.016 07:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:18.016 07:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:18.274 07:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:23:18.274 07:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:18.274 07:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:23:18.533 [2024-07-25 07:28:50.948238] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:18.533 07:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' ddcecdd8-f9bf-44b8-8cc5-946205984c83 '!=' ddcecdd8-f9bf-44b8-8cc5-946205984c83 ']' 00:23:18.533 07:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1707390 00:23:18.533 07:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1707390 ']' 00:23:18.533 07:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1707390 00:23:18.533 07:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:23:18.533 07:28:50 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:18.533 07:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1707390 00:23:18.533 07:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:18.533 07:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:18.533 07:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1707390' 00:23:18.533 killing process with pid 1707390 00:23:18.533 07:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1707390 00:23:18.533 [2024-07-25 07:28:51.024667] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:18.533 [2024-07-25 07:28:51.024717] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:18.533 [2024-07-25 07:28:51.024775] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:18.533 [2024-07-25 07:28:51.024786] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2163020 name raid_bdev1, state offline 00:23:18.533 07:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1707390 00:23:18.533 [2024-07-25 07:28:51.056882] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:18.792 07:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:23:18.792 00:23:18.792 real 0m23.734s 00:23:18.792 user 0m43.380s 00:23:18.792 sys 0m4.312s 00:23:18.792 07:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:18.792 07:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:18.792 ************************************ 00:23:18.792 END TEST raid_superblock_test 00:23:18.792 ************************************ 00:23:18.792 07:28:51 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:23:18.792 07:28:51 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:23:18.792 07:28:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:18.792 07:28:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:19.051 ************************************ 00:23:19.051 START TEST raid_read_error_test 00:23:19.051 ************************************ 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 read 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 
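The "(( i++ ))" / "echo BaseBdevN" xtrace entries around this point are the expansion of the error test's base-bdev naming loop (num_base_bdevs=4, as set a few entries below). A rough bash paraphrase of what the trace shows -- not the verbatim bdev_raid.sh source -- is:

    num_base_bdevs=4
    base_bdevs=()
    for ((i = 1; i <= num_base_bdevs; i++)); do
        base_bdevs+=("BaseBdev$i")   # the trace shows one 'echo BaseBdevN' per iteration
    done
    # base_bdevs ends up as ('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4'),
    # matching the array assignment visible in the trace below.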
00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.7YdwwYwdVp 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1711871 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1711871 /var/tmp/spdk-raid.sock 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1711871 ']' 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:19.051 07:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:19.051 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:19.052 07:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:19.052 07:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:19.052 [2024-07-25 07:28:51.400633] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
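The test drives a standalone bdevperf application over the dedicated /var/tmp/spdk-raid.sock RPC socket launched just above. A condensed sketch of the launch-and-wait pattern, using only the flags, paths and helpers visible in the trace (how the PID is captured and how bdevperf output is redirected into the mktemp log file are assumptions, not the verbatim script):

    # Start bdevperf idle (-z waits for RPC configuration) against the raid socket.
    bdevperf_log=$(mktemp -p /raidtest)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k \
        -q 1 -z -f -L bdev_raid > "$bdevperf_log" &
    raid_pid=$!
    # waitforlisten (from autotest_common.sh, whose xtrace appears above) blocks
    # until the RPC socket accepts connections, so the rpc.py calls that follow
    # are safe to issue.
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock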
00:23:19.052 [2024-07-25 07:28:51.400690] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711871 ]
(identical "qat_pci_device_allocate(): Reached maximum number of QAT devices" / "EAL: Requested device ... cannot be used" pairs for 0000:3d:01.0-01.7, 0000:3d:02.0-02.7 and 0000:3f:01.0-01.6 trimmed; the remaining QAT devices follow) 00:23:19.052 

qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:19.052 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:19.052 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:19.052 [2024-07-25 07:28:51.531236] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:19.310 [2024-07-25 07:28:51.615291] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:23:19.310 [2024-07-25 07:28:51.674205] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:19.310 [2024-07-25 07:28:51.674240] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:19.877 07:28:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:19.877 07:28:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:23:19.877 07:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:19.877 07:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:20.135 BaseBdev1_malloc 00:23:20.135 07:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:20.394 true 00:23:20.394 07:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:20.394 [2024-07-25 07:28:52.902963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:20.394 [2024-07-25 07:28:52.903004] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:20.394 [2024-07-25 07:28:52.903023] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xffaa50 00:23:20.394 [2024-07-25 07:28:52.903034] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:20.394 [2024-07-25 07:28:52.904513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:20.394 [2024-07-25 07:28:52.904541] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:20.394 BaseBdev1 00:23:20.394 07:28:52 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:20.394 07:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:20.653 BaseBdev2_malloc 00:23:20.653 07:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:20.911 true 00:23:20.911 07:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:21.170 [2024-07-25 07:28:53.601265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:21.170 [2024-07-25 07:28:53.601304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:21.170 [2024-07-25 07:28:53.601323] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a3f40 00:23:21.170 [2024-07-25 07:28:53.601335] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:21.170 [2024-07-25 07:28:53.602713] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:21.170 [2024-07-25 07:28:53.602740] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:21.170 BaseBdev2 00:23:21.170 07:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:21.170 07:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:21.428 BaseBdev3_malloc 00:23:21.429 07:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:21.701 true 00:23:21.701 07:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:21.972 [2024-07-25 07:28:54.275244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:21.972 [2024-07-25 07:28:54.275282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:21.972 [2024-07-25 07:28:54.275301] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a7250 00:23:21.972 [2024-07-25 07:28:54.275313] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:21.972 [2024-07-25 07:28:54.276665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:21.972 [2024-07-25 07:28:54.276691] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:21.972 BaseBdev3 00:23:21.973 07:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:21.973 07:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:22.230 BaseBdev4_malloc 00:23:22.230 07:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:22.230 true 00:23:22.231 07:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:22.489 [2024-07-25 07:28:54.965278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:22.489 [2024-07-25 07:28:54.965319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:22.489 [2024-07-25 07:28:54.965340] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a8b40 00:23:22.489 [2024-07-25 07:28:54.965351] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:22.489 [2024-07-25 07:28:54.966747] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:22.489 [2024-07-25 07:28:54.966773] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:22.489 BaseBdev4 00:23:22.489 07:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:22.747 [2024-07-25 07:28:55.189893] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:22.747 [2024-07-25 07:28:55.191065] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:22.747 [2024-07-25 07:28:55.191129] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:22.747 [2024-07-25 07:28:55.191190] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:22.747 [2024-07-25 07:28:55.191419] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11a55d0 00:23:22.747 [2024-07-25 07:28:55.191430] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:22.747 [2024-07-25 07:28:55.191604] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xff6c70 00:23:22.747 [2024-07-25 07:28:55.191747] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11a55d0 00:23:22.747 [2024-07-25 07:28:55.191756] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11a55d0 00:23:22.747 [2024-07-25 07:28:55.191852] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:22.747 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
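Stripped of xtrace noise, the configuration traced above builds each base device as a malloc bdev wrapped by an error-injection bdev and a passthru bdev, then assembles the four passthru bdevs into a superblock-enabled raid1. The same sequence, condensed to the rpc.py calls that appear in the trace (the for-loop consolidation and the $RPC shorthand are introduced here for brevity and are not part of the original script):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for n in 1 2 3 4; do
        $RPC bdev_malloc_create 32 512 -b BaseBdev${n}_malloc    # 32 MB backing store, 512 B blocks
        $RPC bdev_error_create BaseBdev${n}_malloc               # exposes EE_BaseBdev${n}_malloc for fault injection
        $RPC bdev_passthru_create -b EE_BaseBdev${n}_malloc -p BaseBdev${n}
    done
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
    # Later the test injects a failure and kicks off I/O, as seen further down:
    #   $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
    #   .../spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests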
00:23:22.748 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:22.748 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:22.748 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.748 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.006 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.006 "name": "raid_bdev1", 00:23:23.006 "uuid": "3e6ae5b3-bc5b-49c1-893c-a8bf74ecb83c", 00:23:23.006 "strip_size_kb": 0, 00:23:23.006 "state": "online", 00:23:23.006 "raid_level": "raid1", 00:23:23.006 "superblock": true, 00:23:23.006 "num_base_bdevs": 4, 00:23:23.006 "num_base_bdevs_discovered": 4, 00:23:23.006 "num_base_bdevs_operational": 4, 00:23:23.006 "base_bdevs_list": [ 00:23:23.006 { 00:23:23.006 "name": "BaseBdev1", 00:23:23.006 "uuid": "29998aa9-5f27-5491-8d58-b76452011fa2", 00:23:23.006 "is_configured": true, 00:23:23.006 "data_offset": 2048, 00:23:23.006 "data_size": 63488 00:23:23.006 }, 00:23:23.006 { 00:23:23.006 "name": "BaseBdev2", 00:23:23.006 "uuid": "b7e44e8b-4973-56f1-8fe4-541ab6cd5214", 00:23:23.006 "is_configured": true, 00:23:23.006 "data_offset": 2048, 00:23:23.006 "data_size": 63488 00:23:23.006 }, 00:23:23.006 { 00:23:23.006 "name": "BaseBdev3", 00:23:23.006 "uuid": "e1fa56ab-2e34-5321-8091-ad7eed50f908", 00:23:23.006 "is_configured": true, 00:23:23.006 "data_offset": 2048, 00:23:23.006 "data_size": 63488 00:23:23.006 }, 00:23:23.006 { 00:23:23.006 "name": "BaseBdev4", 00:23:23.006 "uuid": "26cebea9-1ce4-529c-b6b8-739828c6166d", 00:23:23.006 "is_configured": true, 00:23:23.006 "data_offset": 2048, 00:23:23.006 "data_size": 63488 00:23:23.006 } 00:23:23.006 ] 00:23:23.006 }' 00:23:23.006 07:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.006 07:28:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:23.573 07:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:23:23.573 07:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:23.832 [2024-07-25 07:28:56.116609] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x109a210 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.768 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.026 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:25.026 "name": "raid_bdev1", 00:23:25.026 "uuid": "3e6ae5b3-bc5b-49c1-893c-a8bf74ecb83c", 00:23:25.026 "strip_size_kb": 0, 00:23:25.026 "state": "online", 00:23:25.026 "raid_level": "raid1", 00:23:25.026 "superblock": true, 00:23:25.026 "num_base_bdevs": 4, 00:23:25.026 "num_base_bdevs_discovered": 4, 00:23:25.026 "num_base_bdevs_operational": 4, 00:23:25.026 "base_bdevs_list": [ 00:23:25.026 { 00:23:25.026 "name": "BaseBdev1", 00:23:25.026 "uuid": "29998aa9-5f27-5491-8d58-b76452011fa2", 00:23:25.026 "is_configured": true, 00:23:25.026 "data_offset": 2048, 00:23:25.026 "data_size": 63488 00:23:25.026 }, 00:23:25.026 { 00:23:25.026 "name": "BaseBdev2", 00:23:25.026 "uuid": "b7e44e8b-4973-56f1-8fe4-541ab6cd5214", 00:23:25.026 "is_configured": true, 00:23:25.026 "data_offset": 2048, 00:23:25.026 "data_size": 63488 00:23:25.026 }, 00:23:25.026 { 00:23:25.026 "name": "BaseBdev3", 00:23:25.026 "uuid": "e1fa56ab-2e34-5321-8091-ad7eed50f908", 00:23:25.026 "is_configured": true, 00:23:25.026 "data_offset": 2048, 00:23:25.026 "data_size": 63488 00:23:25.026 }, 00:23:25.026 { 00:23:25.026 "name": "BaseBdev4", 00:23:25.026 "uuid": "26cebea9-1ce4-529c-b6b8-739828c6166d", 00:23:25.026 "is_configured": true, 00:23:25.026 "data_offset": 2048, 00:23:25.026 "data_size": 63488 00:23:25.026 } 00:23:25.026 ] 00:23:25.026 }' 00:23:25.026 07:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:25.026 07:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:25.592 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:25.851 [2024-07-25 07:28:58.199271] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:25.851 [2024-07-25 07:28:58.199307] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:25.851 [2024-07-25 07:28:58.202272] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:25.851 [2024-07-25 07:28:58.202309] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:25.851 [2024-07-25 07:28:58.202422] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, 
going to free all in destruct 00:23:25.851 [2024-07-25 07:28:58.202433] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11a55d0 name raid_bdev1, state offline 00:23:25.851 0 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1711871 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1711871 ']' 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1711871 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1711871 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1711871' 00:23:25.851 killing process with pid 1711871 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1711871 00:23:25.851 [2024-07-25 07:28:58.276636] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:25.851 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1711871 00:23:25.851 [2024-07-25 07:28:58.304360] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.7YdwwYwdVp 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:26.109 00:23:26.109 real 0m7.180s 00:23:26.109 user 0m11.408s 00:23:26.109 sys 0m1.279s 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:26.109 07:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:26.109 ************************************ 00:23:26.109 END TEST raid_read_error_test 00:23:26.109 ************************************ 00:23:26.109 07:28:58 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:23:26.109 07:28:58 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:23:26.109 07:28:58 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:26.109 07:28:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:26.109 ************************************ 00:23:26.109 START TEST raid_write_error_test 00:23:26.109 ************************************ 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # 
raid_io_error_test raid1 4 write 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.r6psWk0ah4 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1713283 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1713283 /var/tmp/spdk-raid.sock 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' 
-z 1713283 ']' 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:26.109 07:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:26.367 [2024-07-25 07:28:58.672911] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:23:26.367 [2024-07-25 07:28:58.672970] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713283 ]
(identical "qat_pci_device_allocate(): Reached maximum number of QAT devices" / "EAL: Requested device ... cannot be used" pairs for 0000:3d:01.0-01.7 and 0000:3d:02.0-02.7 trimmed; the same messages continue below for the 0000:3f devices) 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 
0000:3f:01.0 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:01.1 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:01.2 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:01.3 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:01.4 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:01.5 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:01.6 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:26.367 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:26.367 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:26.367 [2024-07-25 07:28:58.802888] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:26.367 [2024-07-25 07:28:58.889009] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:23:26.626 [2024-07-25 07:28:58.951854] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:26.626 [2024-07-25 07:28:58.951892] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:27.192 07:28:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:27.192 07:28:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:23:27.192 07:28:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:27.192 07:28:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:27.450 BaseBdev1_malloc 00:23:27.450 07:28:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:27.708 true 00:23:27.708 07:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:27.967 [2024-07-25 07:29:00.250464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:27.967 [2024-07-25 07:29:00.250510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.967 [2024-07-25 07:29:00.250530] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24fba50 00:23:27.967 [2024-07-25 07:29:00.250542] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.967 [2024-07-25 07:29:00.252033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.967 [2024-07-25 07:29:00.252062] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:27.967 BaseBdev1 00:23:27.967 07:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:27.967 07:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:27.967 BaseBdev2_malloc 00:23:28.225 07:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:28.225 true 00:23:28.225 07:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:28.484 [2024-07-25 07:29:00.936673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:28.484 [2024-07-25 07:29:00.936720] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:28.484 [2024-07-25 07:29:00.936739] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a4f40 00:23:28.484 [2024-07-25 07:29:00.936751] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:28.484 [2024-07-25 07:29:00.938109] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:28.484 [2024-07-25 07:29:00.938137] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:28.484 BaseBdev2 00:23:28.484 07:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:28.484 07:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:28.742 BaseBdev3_malloc 00:23:28.742 07:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:29.001 true 00:23:29.001 07:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:29.259 [2024-07-25 07:29:01.626665] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:29.259 [2024-07-25 07:29:01.626703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.259 [2024-07-25 07:29:01.626720] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a8250 00:23:29.259 [2024-07-25 07:29:01.626731] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.259 [2024-07-25 07:29:01.627981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.259 [2024-07-25 07:29:01.628007] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:29.259 BaseBdev3 00:23:29.259 07:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:29.259 07:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:29.517 BaseBdev4_malloc 00:23:29.517 07:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:29.775 true 00:23:29.775 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:30.033 [2024-07-25 07:29:02.316562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:30.033 [2024-07-25 07:29:02.316601] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.033 [2024-07-25 07:29:02.316620] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a9b40 00:23:30.033 [2024-07-25 07:29:02.316632] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.033 [2024-07-25 07:29:02.317936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.033 [2024-07-25 07:29:02.317963] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:30.033 BaseBdev4 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:30.033 [2024-07-25 07:29:02.545195] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:30.033 [2024-07-25 07:29:02.546294] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:30.033 [2024-07-25 07:29:02.546357] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:30.033 [2024-07-25 07:29:02.546410] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:30.033 [2024-07-25 07:29:02.546634] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26a65d0 00:23:30.033 [2024-07-25 07:29:02.546645] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:30.033 [2024-07-25 07:29:02.546805] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24f7c70 00:23:30.033 [2024-07-25 07:29:02.546944] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26a65d0 00:23:30.033 [2024-07-25 07:29:02.546954] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26a65d0 00:23:30.033 [2024-07-25 07:29:02.547045] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.033 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.291 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.291 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.291 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.292 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.292 "name": "raid_bdev1", 00:23:30.292 "uuid": "fc685a2f-550d-485d-88bd-3f6020565976", 00:23:30.292 "strip_size_kb": 0, 00:23:30.292 "state": "online", 00:23:30.292 "raid_level": "raid1", 00:23:30.292 "superblock": true, 00:23:30.292 "num_base_bdevs": 4, 00:23:30.292 "num_base_bdevs_discovered": 4, 00:23:30.292 "num_base_bdevs_operational": 4, 00:23:30.292 "base_bdevs_list": [ 00:23:30.292 { 00:23:30.292 "name": "BaseBdev1", 00:23:30.292 "uuid": "c3676383-9ee5-51c6-b29d-0bdd0b1b2579", 00:23:30.292 "is_configured": true, 00:23:30.292 "data_offset": 2048, 00:23:30.292 "data_size": 63488 00:23:30.292 }, 00:23:30.292 { 00:23:30.292 "name": "BaseBdev2", 00:23:30.292 "uuid": "aca9f76f-9570-5a4d-8f5d-4826d55b38f5", 00:23:30.292 "is_configured": true, 00:23:30.292 "data_offset": 2048, 00:23:30.292 "data_size": 63488 00:23:30.292 }, 00:23:30.292 { 00:23:30.292 "name": "BaseBdev3", 00:23:30.292 "uuid": "ff7efd73-e3e9-5ef7-9665-74b0273017b7", 00:23:30.292 "is_configured": true, 00:23:30.292 "data_offset": 2048, 00:23:30.292 "data_size": 63488 00:23:30.292 }, 00:23:30.292 { 00:23:30.292 "name": "BaseBdev4", 00:23:30.292 "uuid": "b1fa9e6e-51ed-5994-bd96-2d070b3b6667", 00:23:30.292 "is_configured": true, 00:23:30.292 "data_offset": 2048, 00:23:30.292 "data_size": 63488 00:23:30.292 } 00:23:30.292 ] 00:23:30.292 }' 00:23:30.292 07:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.292 07:29:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:30.858 07:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:23:30.858 07:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:31.115 [2024-07-25 07:29:03.487905] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259b210 00:23:32.049 07:29:04 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:32.308 [2024-07-25 07:29:04.598203] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:23:32.308 [2024-07-25 07:29:04.598255] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:32.308 [2024-07-25 07:29:04.598470] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x259b210 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=3 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.308 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.566 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.566 "name": "raid_bdev1", 00:23:32.566 "uuid": "fc685a2f-550d-485d-88bd-3f6020565976", 00:23:32.566 "strip_size_kb": 0, 00:23:32.566 "state": "online", 00:23:32.566 "raid_level": "raid1", 00:23:32.566 "superblock": true, 00:23:32.566 "num_base_bdevs": 4, 00:23:32.566 "num_base_bdevs_discovered": 3, 00:23:32.566 "num_base_bdevs_operational": 3, 00:23:32.566 "base_bdevs_list": [ 00:23:32.566 { 00:23:32.566 "name": null, 00:23:32.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.566 "is_configured": false, 00:23:32.566 "data_offset": 2048, 00:23:32.566 "data_size": 63488 00:23:32.566 }, 00:23:32.566 { 00:23:32.566 "name": "BaseBdev2", 00:23:32.566 "uuid": "aca9f76f-9570-5a4d-8f5d-4826d55b38f5", 00:23:32.566 "is_configured": true, 00:23:32.566 "data_offset": 2048, 00:23:32.566 "data_size": 63488 00:23:32.566 }, 00:23:32.566 { 00:23:32.566 "name": "BaseBdev3", 00:23:32.566 "uuid": "ff7efd73-e3e9-5ef7-9665-74b0273017b7", 
00:23:32.566 "is_configured": true, 00:23:32.566 "data_offset": 2048, 00:23:32.566 "data_size": 63488 00:23:32.566 }, 00:23:32.566 { 00:23:32.566 "name": "BaseBdev4", 00:23:32.566 "uuid": "b1fa9e6e-51ed-5994-bd96-2d070b3b6667", 00:23:32.566 "is_configured": true, 00:23:32.566 "data_offset": 2048, 00:23:32.566 "data_size": 63488 00:23:32.566 } 00:23:32.566 ] 00:23:32.566 }' 00:23:32.566 07:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.566 07:29:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:33.134 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:33.134 [2024-07-25 07:29:05.627506] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:33.134 [2024-07-25 07:29:05.627549] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:33.134 [2024-07-25 07:29:05.630444] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:33.134 [2024-07-25 07:29:05.630478] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:33.134 [2024-07-25 07:29:05.630566] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:33.134 [2024-07-25 07:29:05.630576] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26a65d0 name raid_bdev1, state offline 00:23:33.134 0 00:23:33.134 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1713283 00:23:33.134 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1713283 ']' 00:23:33.134 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1713283 00:23:33.134 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:23:33.134 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:33.134 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1713283 00:23:33.392 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:33.392 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:33.392 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1713283' 00:23:33.392 killing process with pid 1713283 00:23:33.392 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1713283 00:23:33.392 [2024-07-25 07:29:05.700009] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:33.392 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1713283 00:23:33.392 [2024-07-25 07:29:05.727177] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:33.392 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.r6psWk0ah4 00:23:33.392 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:33.651 00:23:33.651 real 0m7.338s 00:23:33.651 user 0m11.698s 00:23:33.651 sys 0m1.294s 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:33.651 07:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:33.651 ************************************ 00:23:33.651 END TEST raid_write_error_test 00:23:33.651 ************************************ 00:23:33.651 07:29:05 bdev_raid -- bdev/bdev_raid.sh@955 -- # '[' true = true ']' 00:23:33.651 07:29:05 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:23:33.651 07:29:05 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:23:33.651 07:29:05 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:23:33.651 07:29:05 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:33.651 07:29:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:33.651 ************************************ 00:23:33.651 START TEST raid_rebuild_test 00:23:33.651 ************************************ 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false false true 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:23:33.651 07:29:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local data_offset 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=1714449 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 1714449 /var/tmp/spdk-raid.sock 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 1714449 ']' 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:33.651 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:33.651 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:33.651 [2024-07-25 07:29:06.093391] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:23:33.651 [2024-07-25 07:29:06.093450] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1714449 ] 00:23:33.651 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:33.651 Zero copy mechanism will not be used. 
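For orientation, the bdevperf launch traced above reduces to roughly the following shell sequence. Paths and flags are copied from the trace; the readiness loop is only a simplified stand-in for the suite's waitforlisten helper (rpc_get_methods is used here as an illustrative liveness probe), so treat this as a sketch rather than the canonical test driver.
# Start bdevperf as the RPC target for the rebuild test. The 3 MiB I/O size (-o 3M)
# is what triggers the "greater than zero copy threshold (65536)" notice above.
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
  -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
# Simplified readiness check: retry until the UNIX-domain RPC socket answers.
until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
  rpc_get_methods > /dev/null 2>&1; do
  sleep 0.5
done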
00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.0 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.1 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.2 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.3 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.4 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.5 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.6 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:01.7 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.651 EAL: Requested device 0000:3d:02.0 cannot be used 00:23:33.651 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3d:02.1 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3d:02.2 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3d:02.3 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3d:02.4 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3d:02.5 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3d:02.6 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3d:02.7 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.0 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.1 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.2 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.3 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.4 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.5 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.6 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:33.652 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:33.652 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:33.652 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:33.910 [2024-07-25 07:29:06.225593] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:33.910 [2024-07-25 07:29:06.311542] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:23:33.910 [2024-07-25 07:29:06.366583] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:33.910 [2024-07-25 07:29:06.366615] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:34.476 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:34.476 07:29:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:23:34.476 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:34.476 07:29:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:34.743 BaseBdev1_malloc 00:23:34.743 07:29:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:35.035 [2024-07-25 07:29:07.430483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:35.035 [2024-07-25 07:29:07.430529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:35.035 [2024-07-25 07:29:07.430549] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eec690 00:23:35.035 [2024-07-25 07:29:07.430560] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:35.035 [2024-07-25 07:29:07.431991] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:35.035 [2024-07-25 07:29:07.432019] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:35.035 BaseBdev1 00:23:35.035 07:29:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:35.035 07:29:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:35.305 BaseBdev2_malloc 00:23:35.305 07:29:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:35.563 
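Condensed from the trace above: each base device is a 32 MiB, 512-byte-block malloc bdev wrapped in a passthru bdev that the raid module later claims. A minimal reproduction with the same RPCs (socket path and names as traced; the rpc helper and the loop are just shorthand) would be:
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
for i in 1 2; do
  # 32 MiB malloc backing bdev with 512-byte blocks
  rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
  # passthru wrapper exposed as BaseBdev<i>
  rpc bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
done
The spare traced further below is built the same way, except its malloc bdev sits behind a delay bdev (bdev_delay_create) before the passthru wrapper, and the two base bdevs are then combined with bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1.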
[2024-07-25 07:29:07.892041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:35.563 [2024-07-25 07:29:07.892081] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:35.563 [2024-07-25 07:29:07.892100] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eed050 00:23:35.563 [2024-07-25 07:29:07.892112] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:35.563 [2024-07-25 07:29:07.893362] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:35.563 [2024-07-25 07:29:07.893388] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:35.563 BaseBdev2 00:23:35.563 07:29:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:35.821 spare_malloc 00:23:35.821 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:36.080 spare_delay 00:23:36.080 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:36.080 [2024-07-25 07:29:08.586018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:36.080 [2024-07-25 07:29:08.586056] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:36.080 [2024-07-25 07:29:08.586073] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f8d810 00:23:36.080 [2024-07-25 07:29:08.586084] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:36.080 [2024-07-25 07:29:08.587378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:36.080 [2024-07-25 07:29:08.587405] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:36.080 spare 00:23:36.080 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:36.339 [2024-07-25 07:29:08.806619] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:36.339 [2024-07-25 07:29:08.807707] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:36.339 [2024-07-25 07:29:08.807783] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f8c6c0 00:23:36.339 [2024-07-25 07:29:08.807794] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:36.339 [2024-07-25 07:29:08.807969] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ee4a70 00:23:36.339 [2024-07-25 07:29:08.808098] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f8c6c0 00:23:36.339 [2024-07-25 07:29:08.808107] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f8c6c0 00:23:36.339 [2024-07-25 07:29:08.808211] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.340 07:29:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.599 07:29:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.599 "name": "raid_bdev1", 00:23:36.599 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:36.599 "strip_size_kb": 0, 00:23:36.599 "state": "online", 00:23:36.599 "raid_level": "raid1", 00:23:36.599 "superblock": false, 00:23:36.599 "num_base_bdevs": 2, 00:23:36.599 "num_base_bdevs_discovered": 2, 00:23:36.599 "num_base_bdevs_operational": 2, 00:23:36.599 "base_bdevs_list": [ 00:23:36.599 { 00:23:36.599 "name": "BaseBdev1", 00:23:36.599 "uuid": "c7afb70d-c333-5689-b64d-ff58acd808ab", 00:23:36.599 "is_configured": true, 00:23:36.599 "data_offset": 0, 00:23:36.599 "data_size": 65536 00:23:36.599 }, 00:23:36.599 { 00:23:36.599 "name": "BaseBdev2", 00:23:36.599 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:36.599 "is_configured": true, 00:23:36.599 "data_offset": 0, 00:23:36.599 "data_size": 65536 00:23:36.599 } 00:23:36.599 ] 00:23:36.599 }' 00:23:36.599 07:29:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.599 07:29:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:37.165 07:29:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:37.165 07:29:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:23:37.424 [2024-07-25 07:29:09.849592] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:37.424 07:29:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:23:37.424 07:29:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.424 07:29:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:37.682 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:37.940 [2024-07-25 07:29:10.310615] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f98970 00:23:37.940 /dev/nbd0 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:37.940 1+0 records in 00:23:37.940 1+0 records out 00:23:37.940 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274139 s, 14.9 MB/s 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test 
-- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:23:37.940 07:29:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:43.202 65536+0 records in 00:23:43.202 65536+0 records out 00:23:43.202 33554432 bytes (34 MB, 32 MiB) copied, 4.30871 s, 7.8 MB/s 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:43.202 [2024-07-25 07:29:14.923826] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:43.202 07:29:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:43.202 [2024-07-25 07:29:15.128463] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.202 
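The degrade step condensed from the surrounding trace: the 32 MiB raid1 volume is filled with random data through its NBD export, the export is torn down, and BaseBdev1 is pulled out so the array drops to a single operational member. As a sketch (commands as traced; the rpc helper is just shorthand):
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
# Write 65536 x 512-byte blocks of random data through the NBD export of raid_bdev1.
dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
rpc nbd_stop_disk /dev/nbd0
# Remove one mirror leg; the state check that follows expects 1 of 2 base bdevs operational.
rpc bdev_raid_remove_base_bdev BaseBdev1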
07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.202 "name": "raid_bdev1", 00:23:43.202 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:43.202 "strip_size_kb": 0, 00:23:43.202 "state": "online", 00:23:43.202 "raid_level": "raid1", 00:23:43.202 "superblock": false, 00:23:43.202 "num_base_bdevs": 2, 00:23:43.202 "num_base_bdevs_discovered": 1, 00:23:43.202 "num_base_bdevs_operational": 1, 00:23:43.202 "base_bdevs_list": [ 00:23:43.202 { 00:23:43.202 "name": null, 00:23:43.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.202 "is_configured": false, 00:23:43.202 "data_offset": 0, 00:23:43.202 "data_size": 65536 00:23:43.202 }, 00:23:43.202 { 00:23:43.202 "name": "BaseBdev2", 00:23:43.202 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:43.202 "is_configured": true, 00:23:43.202 "data_offset": 0, 00:23:43.202 "data_size": 65536 00:23:43.202 } 00:23:43.202 ] 00:23:43.202 }' 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.202 07:29:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:43.460 07:29:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:43.718 [2024-07-25 07:29:16.159197] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:43.718 [2024-07-25 07:29:16.163967] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f98850 00:23:43.718 [2024-07-25 07:29:16.166087] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:43.718 07:29:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:44.650 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:44.650 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:44.650 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:44.650 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:44.650 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:44.908 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.908 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:44.908 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:44.908 "name": "raid_bdev1", 00:23:44.908 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:44.908 "strip_size_kb": 0, 00:23:44.908 "state": "online", 00:23:44.908 "raid_level": "raid1", 00:23:44.908 "superblock": false, 00:23:44.908 "num_base_bdevs": 2, 00:23:44.908 "num_base_bdevs_discovered": 2, 00:23:44.908 "num_base_bdevs_operational": 2, 
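Rebuild is kicked off by re-adding the spare, after which the test polls the raid bdev's process descriptor until it disappears. A rough equivalent of that loop, with RPC names and jq filters taken from the trace (the loop shape itself is an illustrative simplification):
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
# Re-attach the spare; the raid module starts rebuilding onto it ("Started rebuild" notice above).
rpc bdev_raid_add_base_bdev raid_bdev1 spare
while true; do
  info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # While rebuilding, .process reports type "rebuild", target "spare" and block/percent progress.
  [ "$(echo "$info" | jq -r '.process.type // "none"')" = "none" ] && break
  echo "rebuild progress: $(echo "$info" | jq -r '.process.progress.percent')%"
  sleep 1
done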
00:23:44.908 "process": { 00:23:44.908 "type": "rebuild", 00:23:44.908 "target": "spare", 00:23:44.908 "progress": { 00:23:44.908 "blocks": 24576, 00:23:44.908 "percent": 37 00:23:44.908 } 00:23:44.908 }, 00:23:44.908 "base_bdevs_list": [ 00:23:44.908 { 00:23:44.908 "name": "spare", 00:23:44.908 "uuid": "3ba3de87-3d60-5301-85b3-ed0dfdc081ef", 00:23:44.908 "is_configured": true, 00:23:44.908 "data_offset": 0, 00:23:44.908 "data_size": 65536 00:23:44.908 }, 00:23:44.908 { 00:23:44.908 "name": "BaseBdev2", 00:23:44.908 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:44.908 "is_configured": true, 00:23:44.908 "data_offset": 0, 00:23:44.908 "data_size": 65536 00:23:44.908 } 00:23:44.908 ] 00:23:44.908 }' 00:23:44.908 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:45.166 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:45.166 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:45.166 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:45.166 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:45.424 [2024-07-25 07:29:17.712417] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:45.424 [2024-07-25 07:29:17.777893] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:45.424 [2024-07-25 07:29:17.777939] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:45.424 [2024-07-25 07:29:17.777953] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:45.424 [2024-07-25 07:29:17.777961] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.424 07:29:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.682 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.682 "name": "raid_bdev1", 00:23:45.682 "uuid": 
"f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:45.682 "strip_size_kb": 0, 00:23:45.682 "state": "online", 00:23:45.682 "raid_level": "raid1", 00:23:45.682 "superblock": false, 00:23:45.682 "num_base_bdevs": 2, 00:23:45.682 "num_base_bdevs_discovered": 1, 00:23:45.682 "num_base_bdevs_operational": 1, 00:23:45.682 "base_bdevs_list": [ 00:23:45.682 { 00:23:45.682 "name": null, 00:23:45.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.682 "is_configured": false, 00:23:45.682 "data_offset": 0, 00:23:45.682 "data_size": 65536 00:23:45.682 }, 00:23:45.682 { 00:23:45.682 "name": "BaseBdev2", 00:23:45.682 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:45.682 "is_configured": true, 00:23:45.682 "data_offset": 0, 00:23:45.682 "data_size": 65536 00:23:45.682 } 00:23:45.682 ] 00:23:45.682 }' 00:23:45.682 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.682 07:29:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:46.248 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:46.248 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:46.248 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:46.248 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:46.248 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:46.248 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.248 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.506 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:46.506 "name": "raid_bdev1", 00:23:46.506 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:46.506 "strip_size_kb": 0, 00:23:46.506 "state": "online", 00:23:46.506 "raid_level": "raid1", 00:23:46.506 "superblock": false, 00:23:46.506 "num_base_bdevs": 2, 00:23:46.506 "num_base_bdevs_discovered": 1, 00:23:46.506 "num_base_bdevs_operational": 1, 00:23:46.506 "base_bdevs_list": [ 00:23:46.506 { 00:23:46.506 "name": null, 00:23:46.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:46.506 "is_configured": false, 00:23:46.506 "data_offset": 0, 00:23:46.506 "data_size": 65536 00:23:46.506 }, 00:23:46.506 { 00:23:46.506 "name": "BaseBdev2", 00:23:46.506 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:46.506 "is_configured": true, 00:23:46.506 "data_offset": 0, 00:23:46.506 "data_size": 65536 00:23:46.506 } 00:23:46.506 ] 00:23:46.506 }' 00:23:46.506 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:46.506 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:46.506 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:46.507 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:46.507 07:29:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:46.765 [2024-07-25 07:29:19.117898] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev spare is claimed 00:23:46.765 [2024-07-25 07:29:19.122634] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f98850 00:23:46.765 [2024-07-25 07:29:19.124066] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:46.765 07:29:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:23:47.700 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.700 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.700 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:47.700 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:47.700 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.700 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.700 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.958 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:47.958 "name": "raid_bdev1", 00:23:47.958 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:47.958 "strip_size_kb": 0, 00:23:47.958 "state": "online", 00:23:47.958 "raid_level": "raid1", 00:23:47.958 "superblock": false, 00:23:47.958 "num_base_bdevs": 2, 00:23:47.958 "num_base_bdevs_discovered": 2, 00:23:47.958 "num_base_bdevs_operational": 2, 00:23:47.958 "process": { 00:23:47.958 "type": "rebuild", 00:23:47.958 "target": "spare", 00:23:47.958 "progress": { 00:23:47.958 "blocks": 24576, 00:23:47.958 "percent": 37 00:23:47.958 } 00:23:47.958 }, 00:23:47.958 "base_bdevs_list": [ 00:23:47.958 { 00:23:47.958 "name": "spare", 00:23:47.958 "uuid": "3ba3de87-3d60-5301-85b3-ed0dfdc081ef", 00:23:47.958 "is_configured": true, 00:23:47.958 "data_offset": 0, 00:23:47.958 "data_size": 65536 00:23:47.958 }, 00:23:47.958 { 00:23:47.958 "name": "BaseBdev2", 00:23:47.958 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:47.958 "is_configured": true, 00:23:47.958 "data_offset": 0, 00:23:47.958 "data_size": 65536 00:23:47.958 } 00:23:47.958 ] 00:23:47.958 }' 00:23:47.958 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:47.958 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=738 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.959 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.217 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.217 "name": "raid_bdev1", 00:23:48.217 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:48.217 "strip_size_kb": 0, 00:23:48.217 "state": "online", 00:23:48.217 "raid_level": "raid1", 00:23:48.217 "superblock": false, 00:23:48.217 "num_base_bdevs": 2, 00:23:48.217 "num_base_bdevs_discovered": 2, 00:23:48.217 "num_base_bdevs_operational": 2, 00:23:48.217 "process": { 00:23:48.217 "type": "rebuild", 00:23:48.217 "target": "spare", 00:23:48.217 "progress": { 00:23:48.217 "blocks": 30720, 00:23:48.217 "percent": 46 00:23:48.217 } 00:23:48.217 }, 00:23:48.217 "base_bdevs_list": [ 00:23:48.217 { 00:23:48.217 "name": "spare", 00:23:48.217 "uuid": "3ba3de87-3d60-5301-85b3-ed0dfdc081ef", 00:23:48.217 "is_configured": true, 00:23:48.217 "data_offset": 0, 00:23:48.217 "data_size": 65536 00:23:48.217 }, 00:23:48.217 { 00:23:48.217 "name": "BaseBdev2", 00:23:48.217 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:48.217 "is_configured": true, 00:23:48.217 "data_offset": 0, 00:23:48.217 "data_size": 65536 00:23:48.217 } 00:23:48.217 ] 00:23:48.217 }' 00:23:48.217 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.217 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:48.217 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:48.475 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:48.475 07:29:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.409 07:29:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.667 07:29:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:23:49.667 "name": "raid_bdev1", 00:23:49.667 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:49.667 "strip_size_kb": 0, 00:23:49.667 "state": "online", 00:23:49.667 "raid_level": "raid1", 00:23:49.667 "superblock": false, 00:23:49.667 "num_base_bdevs": 2, 00:23:49.667 "num_base_bdevs_discovered": 2, 00:23:49.667 "num_base_bdevs_operational": 2, 00:23:49.667 "process": { 00:23:49.667 "type": "rebuild", 00:23:49.667 "target": "spare", 00:23:49.667 "progress": { 00:23:49.667 "blocks": 57344, 00:23:49.667 "percent": 87 00:23:49.667 } 00:23:49.667 }, 00:23:49.667 "base_bdevs_list": [ 00:23:49.667 { 00:23:49.667 "name": "spare", 00:23:49.667 "uuid": "3ba3de87-3d60-5301-85b3-ed0dfdc081ef", 00:23:49.667 "is_configured": true, 00:23:49.667 "data_offset": 0, 00:23:49.667 "data_size": 65536 00:23:49.667 }, 00:23:49.667 { 00:23:49.667 "name": "BaseBdev2", 00:23:49.667 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:49.667 "is_configured": true, 00:23:49.667 "data_offset": 0, 00:23:49.667 "data_size": 65536 00:23:49.667 } 00:23:49.667 ] 00:23:49.667 }' 00:23:49.667 07:29:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:49.667 07:29:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:49.667 07:29:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:49.667 07:29:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:49.667 07:29:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:49.925 [2024-07-25 07:29:22.347344] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:49.925 [2024-07-25 07:29:22.347401] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:49.925 [2024-07-25 07:29:22.347440] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.860 "name": "raid_bdev1", 00:23:50.860 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:50.860 "strip_size_kb": 0, 00:23:50.860 "state": "online", 00:23:50.860 "raid_level": "raid1", 00:23:50.860 "superblock": false, 00:23:50.860 "num_base_bdevs": 2, 00:23:50.860 "num_base_bdevs_discovered": 2, 00:23:50.860 "num_base_bdevs_operational": 2, 00:23:50.860 "base_bdevs_list": [ 00:23:50.860 { 00:23:50.860 "name": "spare", 00:23:50.860 "uuid": 
"3ba3de87-3d60-5301-85b3-ed0dfdc081ef", 00:23:50.860 "is_configured": true, 00:23:50.860 "data_offset": 0, 00:23:50.860 "data_size": 65536 00:23:50.860 }, 00:23:50.860 { 00:23:50.860 "name": "BaseBdev2", 00:23:50.860 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:50.860 "is_configured": true, 00:23:50.860 "data_offset": 0, 00:23:50.860 "data_size": 65536 00:23:50.860 } 00:23:50.860 ] 00:23:50.860 }' 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:50.860 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.118 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.414 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.414 "name": "raid_bdev1", 00:23:51.414 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:51.414 "strip_size_kb": 0, 00:23:51.414 "state": "online", 00:23:51.414 "raid_level": "raid1", 00:23:51.414 "superblock": false, 00:23:51.414 "num_base_bdevs": 2, 00:23:51.414 "num_base_bdevs_discovered": 2, 00:23:51.414 "num_base_bdevs_operational": 2, 00:23:51.414 "base_bdevs_list": [ 00:23:51.414 { 00:23:51.414 "name": "spare", 00:23:51.414 "uuid": "3ba3de87-3d60-5301-85b3-ed0dfdc081ef", 00:23:51.414 "is_configured": true, 00:23:51.414 "data_offset": 0, 00:23:51.415 "data_size": 65536 00:23:51.415 }, 00:23:51.415 { 00:23:51.415 "name": "BaseBdev2", 00:23:51.415 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:51.415 "is_configured": true, 00:23:51.415 "data_offset": 0, 00:23:51.415 "data_size": 65536 00:23:51.415 } 00:23:51.415 ] 00:23:51.415 }' 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:51.415 
07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.415 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.686 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:51.686 "name": "raid_bdev1", 00:23:51.686 "uuid": "f3f4b1dd-738b-42a0-a1b3-211d4992e0bf", 00:23:51.686 "strip_size_kb": 0, 00:23:51.686 "state": "online", 00:23:51.686 "raid_level": "raid1", 00:23:51.686 "superblock": false, 00:23:51.686 "num_base_bdevs": 2, 00:23:51.686 "num_base_bdevs_discovered": 2, 00:23:51.686 "num_base_bdevs_operational": 2, 00:23:51.686 "base_bdevs_list": [ 00:23:51.686 { 00:23:51.686 "name": "spare", 00:23:51.686 "uuid": "3ba3de87-3d60-5301-85b3-ed0dfdc081ef", 00:23:51.686 "is_configured": true, 00:23:51.686 "data_offset": 0, 00:23:51.686 "data_size": 65536 00:23:51.686 }, 00:23:51.686 { 00:23:51.686 "name": "BaseBdev2", 00:23:51.686 "uuid": "046de4f3-5a64-56fa-b5e5-2d3aa5fb96c0", 00:23:51.686 "is_configured": true, 00:23:51.686 "data_offset": 0, 00:23:51.686 "data_size": 65536 00:23:51.686 } 00:23:51.686 ] 00:23:51.686 }' 00:23:51.686 07:29:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:51.686 07:29:23 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:52.252 07:29:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:52.252 [2024-07-25 07:29:24.777861] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:52.252 [2024-07-25 07:29:24.777887] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:52.252 [2024-07-25 07:29:24.777942] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:52.252 [2024-07-25 07:29:24.777994] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:52.252 [2024-07-25 07:29:24.778005] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f8c6c0 name raid_bdev1, state offline 00:23:52.509 07:29:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.509 07:29:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:23:52.509 07:29:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:23:52.510 07:29:25 
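The wind-down traced here: assert that the array is back to a fully redundant online raid1 with both members configured, then delete it and confirm no raid bdevs remain before the two mirror legs are compared over NBD. Approximately (field names and RPCs from the trace; the bracket tests are a compact stand-in for verify_raid_bdev_state):
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[ "$(echo "$info" | jq -r .state)" = "online" ]
[ "$(echo "$info" | jq -r .raid_level)" = "raid1" ]
[ "$(echo "$info" | jq -r .num_base_bdevs_operational)" = "2" ]
# Tear the array down; only the byte-for-byte cmp of BaseBdev1 against the rebuilt spare remains.
rpc bdev_raid_delete raid_bdev1
[ "$(rpc bdev_raid_get_bdevs all | jq length)" = "0" ]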
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:52.510 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:52.767 /dev/nbd0 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:52.767 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:52.767 1+0 records in 00:23:52.767 1+0 records out 00:23:52.767 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263271 s, 15.6 MB/s 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:52.768 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:53.025 /dev/nbd1 00:23:53.025 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:53.025 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:53.026 1+0 records in 00:23:53.026 1+0 records out 00:23:53.026 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307969 s, 13.3 MB/s 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:53.026 07:29:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:53.284 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # 
local nbd_name=nbd0 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:53.542 07:29:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 1714449 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 1714449 ']' 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 1714449 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1714449 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1714449' 00:23:53.800 killing process with pid 1714449 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 1714449 00:23:53.800 Received shutdown signal, test time was about 60.000000 seconds 00:23:53.800 00:23:53.800 Latency(us) 00:23:53.800 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:53.800 =================================================================================================================== 00:23:53.800 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:53.800 [2024-07-25 07:29:26.143677] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:53.800 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 1714449 00:23:53.800 [2024-07-25 07:29:26.168125] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:54.059 07:29:26 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:23:54.059 00:23:54.059 real 0m20.333s 00:23:54.059 user 0m27.735s 00:23:54.059 sys 0m4.474s 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:54.059 ************************************ 00:23:54.059 END TEST raid_rebuild_test 00:23:54.059 ************************************ 00:23:54.059 07:29:26 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:23:54.059 07:29:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:23:54.059 07:29:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:54.059 07:29:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:54.059 ************************************ 00:23:54.059 START TEST raid_rebuild_test_sb 00:23:54.059 ************************************ 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:23:54.059 
07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=1718117 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 1718117 /var/tmp/spdk-raid.sock 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1718117 ']' 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:54.059 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:54.059 07:29:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:54.059 [2024-07-25 07:29:26.516859] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:23:54.059 [2024-07-25 07:29:26.516918] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1718117 ] 00:23:54.059 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:54.059 Zero copy mechanism will not be used. 
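
The trace above starts a dedicated bdevperf instance in wait mode (-z) on its own RPC socket and then blocks in waitforlisten until that socket answers. Below is a minimal standalone sketch of that launch pattern, using only the paths and flags visible in the trace; the polling loop stands in for the harness's waitforlisten helper, and rpc_get_methods is used only as a cheap RPC to probe the socket with.

    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC_SOCK=/var/tmp/spdk-raid.sock
    # Start bdevperf without a bdev config (-z) so the raid bdevs can be built over RPC;
    # the remaining flags are copied verbatim from the command line traced above.
    "$SPDK_DIR/build/examples/bdevperf" -r "$RPC_SOCK" -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!
    # Wait until the RPC socket accepts connections before issuing any bdev_* RPCs
    # (the harness does this with its waitforlisten helper from autotest_common.sh).
    until "$SPDK_DIR/scripts/rpc.py" -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done
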
00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.0 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.1 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.2 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.3 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.4 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.5 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.6 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:01.7 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.0 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.1 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.2 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.3 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.4 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.5 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.6 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3d:02.7 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3f:01.0 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3f:01.1 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3f:01.2 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.059 EAL: Requested device 0000:3f:01.3 cannot be used 00:23:54.059 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:01.4 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:01.5 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:01.6 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:54.318 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:54.318 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:54.318 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:54.318 [2024-07-25 07:29:26.650765] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:54.318 [2024-07-25 07:29:26.732564] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:23:54.318 [2024-07-25 07:29:26.793666] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:54.318 [2024-07-25 07:29:26.793702] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:54.884 07:29:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:54.884 07:29:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:23:54.884 07:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:54.884 07:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:55.142 BaseBdev1_malloc 00:23:55.142 07:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:55.400 [2024-07-25 07:29:27.847518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:55.400 [2024-07-25 07:29:27.847566] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.400 [2024-07-25 07:29:27.847586] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d5690 00:23:55.400 [2024-07-25 07:29:27.847597] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.400 [2024-07-25 07:29:27.849003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.400 [2024-07-25 07:29:27.849030] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:55.400 BaseBdev1 00:23:55.400 07:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:55.400 07:29:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:55.658 BaseBdev2_malloc 00:23:55.658 07:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 
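
For reference, the RPC sequence traced in this prologue builds each base bdev as a 32 MiB malloc bdev (512-byte blocks) wrapped in a passthru bdev, places the rebuild spare on top of a delay bdev, and then, a few entries further down, assembles the two base bdevs into a superblock-enabled RAID1. A condensed sketch of that sequence, with the commands and names taken directly from the trace:

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    # Base bdevs: malloc backing store behind a passthru wrapper.
    rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
    rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
    rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
    # The rebuild spare sits on a delay bdev so its I/O can be slowed down.
    rpc bdev_malloc_create 32 512 -b spare_malloc
    rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    rpc bdev_passthru_create -b spare_delay -p spare
    # Assemble the base bdevs into a RAID1 with an on-disk superblock (-s).
    rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
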
00:23:55.917 [2024-07-25 07:29:28.301018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:55.917 [2024-07-25 07:29:28.301060] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.917 [2024-07-25 07:29:28.301080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d6050 00:23:55.917 [2024-07-25 07:29:28.301096] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.917 [2024-07-25 07:29:28.302371] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.917 [2024-07-25 07:29:28.302398] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:55.917 BaseBdev2 00:23:55.917 07:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:56.175 spare_malloc 00:23:56.175 07:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:56.433 spare_delay 00:23:56.433 07:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:56.691 [2024-07-25 07:29:28.975056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:56.691 [2024-07-25 07:29:28.975096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:56.691 [2024-07-25 07:29:28.975114] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1976810 00:23:56.691 [2024-07-25 07:29:28.975125] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:56.691 [2024-07-25 07:29:28.976423] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:56.691 [2024-07-25 07:29:28.976463] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:56.691 spare 00:23:56.691 07:29:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:56.691 [2024-07-25 07:29:29.203875] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:56.691 [2024-07-25 07:29:29.204927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:56.691 [2024-07-25 07:29:29.205082] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19756c0 00:23:56.691 [2024-07-25 07:29:29.205094] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:56.691 [2024-07-25 07:29:29.205262] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18d7780 00:23:56.691 [2024-07-25 07:29:29.205388] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19756c0 00:23:56.691 [2024-07-25 07:29:29.205397] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19756c0 00:23:56.691 [2024-07-25 07:29:29.205482] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:56.691 07:29:29 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:56.691 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:56.691 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:56.949 "name": "raid_bdev1", 00:23:56.949 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:23:56.949 "strip_size_kb": 0, 00:23:56.949 "state": "online", 00:23:56.949 "raid_level": "raid1", 00:23:56.949 "superblock": true, 00:23:56.949 "num_base_bdevs": 2, 00:23:56.949 "num_base_bdevs_discovered": 2, 00:23:56.949 "num_base_bdevs_operational": 2, 00:23:56.949 "base_bdevs_list": [ 00:23:56.949 { 00:23:56.949 "name": "BaseBdev1", 00:23:56.949 "uuid": "c23b3358-d77b-5230-8cb3-f2509b6a79a4", 00:23:56.949 "is_configured": true, 00:23:56.949 "data_offset": 2048, 00:23:56.949 "data_size": 63488 00:23:56.949 }, 00:23:56.949 { 00:23:56.949 "name": "BaseBdev2", 00:23:56.949 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:23:56.949 "is_configured": true, 00:23:56.949 "data_offset": 2048, 00:23:56.949 "data_size": 63488 00:23:56.949 } 00:23:56.949 ] 00:23:56.949 }' 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:56.949 07:29:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:57.514 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:57.514 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:23:57.781 [2024-07-25 07:29:30.238813] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:57.781 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:23:57.781 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.781 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:58.041 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:58.299 [2024-07-25 07:29:30.695994] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1981120 00:23:58.299 /dev/nbd0 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:58.299 1+0 records in 00:23:58.299 1+0 records out 00:23:58.299 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253532 s, 16.2 MB/s 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@889 -- # return 0 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:23:58.299 07:29:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:03.560 63488+0 records in 00:24:03.560 63488+0 records out 00:24:03.560 32505856 bytes (33 MB, 31 MiB) copied, 4.77188 s, 6.8 MB/s 00:24:03.560 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:03.561 [2024-07-25 07:29:35.777885] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:03.561 07:29:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:03.561 [2024-07-25 07:29:35.994492] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.561 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.819 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.819 "name": "raid_bdev1", 00:24:03.819 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:03.819 "strip_size_kb": 0, 00:24:03.819 "state": "online", 00:24:03.819 "raid_level": "raid1", 00:24:03.819 "superblock": true, 00:24:03.819 "num_base_bdevs": 2, 00:24:03.819 "num_base_bdevs_discovered": 1, 00:24:03.819 "num_base_bdevs_operational": 1, 00:24:03.819 "base_bdevs_list": [ 00:24:03.819 { 00:24:03.819 "name": null, 00:24:03.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.819 "is_configured": false, 00:24:03.819 "data_offset": 2048, 00:24:03.819 "data_size": 63488 00:24:03.820 }, 00:24:03.820 { 00:24:03.820 "name": "BaseBdev2", 00:24:03.820 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:03.820 "is_configured": true, 00:24:03.820 "data_offset": 2048, 00:24:03.820 "data_size": 63488 00:24:03.820 } 00:24:03.820 ] 00:24:03.820 }' 00:24:03.820 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.820 07:29:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:04.386 07:29:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:04.644 [2024-07-25 07:29:37.013204] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:04.644 [2024-07-25 07:29:37.018031] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19810c0 00:24:04.644 [2024-07-25 07:29:37.020149] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:04.644 07:29:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:05.578 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:05.578 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.578 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:05.578 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:05.578 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.578 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.578 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.835 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.835 "name": 
"raid_bdev1", 00:24:05.835 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:05.835 "strip_size_kb": 0, 00:24:05.835 "state": "online", 00:24:05.835 "raid_level": "raid1", 00:24:05.835 "superblock": true, 00:24:05.835 "num_base_bdevs": 2, 00:24:05.835 "num_base_bdevs_discovered": 2, 00:24:05.835 "num_base_bdevs_operational": 2, 00:24:05.835 "process": { 00:24:05.835 "type": "rebuild", 00:24:05.835 "target": "spare", 00:24:05.835 "progress": { 00:24:05.835 "blocks": 24576, 00:24:05.835 "percent": 38 00:24:05.835 } 00:24:05.835 }, 00:24:05.835 "base_bdevs_list": [ 00:24:05.835 { 00:24:05.835 "name": "spare", 00:24:05.835 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:05.835 "is_configured": true, 00:24:05.835 "data_offset": 2048, 00:24:05.835 "data_size": 63488 00:24:05.835 }, 00:24:05.835 { 00:24:05.835 "name": "BaseBdev2", 00:24:05.835 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:05.835 "is_configured": true, 00:24:05.835 "data_offset": 2048, 00:24:05.835 "data_size": 63488 00:24:05.835 } 00:24:05.835 ] 00:24:05.835 }' 00:24:05.835 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.835 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:05.835 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.835 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:05.835 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:06.402 [2024-07-25 07:29:38.840679] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.402 [2024-07-25 07:29:38.934105] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:06.402 [2024-07-25 07:29:38.934153] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:06.402 [2024-07-25 07:29:38.934168] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.402 [2024-07-25 07:29:38.934176] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.660 07:29:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.919 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.919 "name": "raid_bdev1", 00:24:06.919 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:06.919 "strip_size_kb": 0, 00:24:06.919 "state": "online", 00:24:06.919 "raid_level": "raid1", 00:24:06.919 "superblock": true, 00:24:06.919 "num_base_bdevs": 2, 00:24:06.919 "num_base_bdevs_discovered": 1, 00:24:06.919 "num_base_bdevs_operational": 1, 00:24:06.919 "base_bdevs_list": [ 00:24:06.919 { 00:24:06.919 "name": null, 00:24:06.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.919 "is_configured": false, 00:24:06.919 "data_offset": 2048, 00:24:06.919 "data_size": 63488 00:24:06.919 }, 00:24:06.919 { 00:24:06.919 "name": "BaseBdev2", 00:24:06.919 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:06.919 "is_configured": true, 00:24:06.919 "data_offset": 2048, 00:24:06.919 "data_size": 63488 00:24:06.919 } 00:24:06.919 ] 00:24:06.919 }' 00:24:06.919 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.919 07:29:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:07.516 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:07.516 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.516 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:07.516 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:07.516 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.516 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.516 07:29:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.516 07:29:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.516 "name": "raid_bdev1", 00:24:07.516 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:07.516 "strip_size_kb": 0, 00:24:07.516 "state": "online", 00:24:07.516 "raid_level": "raid1", 00:24:07.516 "superblock": true, 00:24:07.516 "num_base_bdevs": 2, 00:24:07.516 "num_base_bdevs_discovered": 1, 00:24:07.516 "num_base_bdevs_operational": 1, 00:24:07.516 "base_bdevs_list": [ 00:24:07.516 { 00:24:07.516 "name": null, 00:24:07.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.516 "is_configured": false, 00:24:07.516 "data_offset": 2048, 00:24:07.516 "data_size": 63488 00:24:07.516 }, 00:24:07.516 { 00:24:07.516 "name": "BaseBdev2", 00:24:07.516 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:07.516 "is_configured": true, 00:24:07.516 "data_offset": 2048, 00:24:07.516 "data_size": 63488 00:24:07.516 } 00:24:07.516 ] 00:24:07.516 }' 00:24:07.516 07:29:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.774 07:29:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:07.774 07:29:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 
-- # jq -r '.process.target // "none"' 00:24:07.774 07:29:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:07.774 07:29:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:08.032 [2024-07-25 07:29:40.321887] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:08.032 [2024-07-25 07:29:40.326643] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18cc8f0 00:24:08.032 [2024-07-25 07:29:40.327997] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:08.032 07:29:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:08.965 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.965 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.965 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.965 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.965 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.965 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.965 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.223 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.223 "name": "raid_bdev1", 00:24:09.223 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:09.223 "strip_size_kb": 0, 00:24:09.223 "state": "online", 00:24:09.223 "raid_level": "raid1", 00:24:09.223 "superblock": true, 00:24:09.223 "num_base_bdevs": 2, 00:24:09.223 "num_base_bdevs_discovered": 2, 00:24:09.223 "num_base_bdevs_operational": 2, 00:24:09.223 "process": { 00:24:09.223 "type": "rebuild", 00:24:09.223 "target": "spare", 00:24:09.223 "progress": { 00:24:09.223 "blocks": 24576, 00:24:09.223 "percent": 38 00:24:09.223 } 00:24:09.223 }, 00:24:09.223 "base_bdevs_list": [ 00:24:09.223 { 00:24:09.223 "name": "spare", 00:24:09.223 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:09.223 "is_configured": true, 00:24:09.223 "data_offset": 2048, 00:24:09.223 "data_size": 63488 00:24:09.223 }, 00:24:09.223 { 00:24:09.223 "name": "BaseBdev2", 00:24:09.223 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:09.223 "is_configured": true, 00:24:09.223 "data_offset": 2048, 00:24:09.223 "data_size": 63488 00:24:09.223 } 00:24:09.223 ] 00:24:09.223 }' 00:24:09.223 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 
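
The xtrace entry just above shows the test expanding an empty variable inside a bare [ ... ] test ('[' = false ']'), which is why bash prints the "unary operator expected" diagnostic from bdev_raid.sh line 681 immediately below; the script treats the failed test as false and carries on at line 706. A minimal reproduction of that shell pitfall and the usual quoting fix follows (a generic illustration, not a patch to the SPDK script):

    # Empty/unset variable inside an unquoted [ ... ] test: the left operand
    # disappears and [ sees only "= false", hence "unary operator expected".
    unset maybe_flag
    [ $maybe_flag = false ] && echo "never reached"
    # Quoting the expansion (or using [[ ... ]]) keeps the comparison well-formed:
    [ "$maybe_flag" = false ] || echo "empty or not false"
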
00:24:09.224 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=759 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.224 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.482 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.482 "name": "raid_bdev1", 00:24:09.482 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:09.482 "strip_size_kb": 0, 00:24:09.482 "state": "online", 00:24:09.482 "raid_level": "raid1", 00:24:09.482 "superblock": true, 00:24:09.482 "num_base_bdevs": 2, 00:24:09.482 "num_base_bdevs_discovered": 2, 00:24:09.482 "num_base_bdevs_operational": 2, 00:24:09.482 "process": { 00:24:09.482 "type": "rebuild", 00:24:09.482 "target": "spare", 00:24:09.482 "progress": { 00:24:09.482 "blocks": 30720, 00:24:09.482 "percent": 48 00:24:09.482 } 00:24:09.482 }, 00:24:09.482 "base_bdevs_list": [ 00:24:09.482 { 00:24:09.482 "name": "spare", 00:24:09.482 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:09.482 "is_configured": true, 00:24:09.482 "data_offset": 2048, 00:24:09.482 "data_size": 63488 00:24:09.482 }, 00:24:09.482 { 00:24:09.482 "name": "BaseBdev2", 00:24:09.482 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:09.482 "is_configured": true, 00:24:09.482 "data_offset": 2048, 00:24:09.482 "data_size": 63488 00:24:09.482 } 00:24:09.482 ] 00:24:09.482 }' 00:24:09.482 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.482 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.482 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.482 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.482 07:29:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:10.855 07:29:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:10.855 07:29:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.855 07:29:42 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.855 07:29:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.855 07:29:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.855 07:29:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.855 07:29:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.855 07:29:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.855 07:29:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.855 "name": "raid_bdev1", 00:24:10.855 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:10.855 "strip_size_kb": 0, 00:24:10.855 "state": "online", 00:24:10.855 "raid_level": "raid1", 00:24:10.855 "superblock": true, 00:24:10.855 "num_base_bdevs": 2, 00:24:10.855 "num_base_bdevs_discovered": 2, 00:24:10.855 "num_base_bdevs_operational": 2, 00:24:10.855 "process": { 00:24:10.855 "type": "rebuild", 00:24:10.855 "target": "spare", 00:24:10.855 "progress": { 00:24:10.855 "blocks": 57344, 00:24:10.855 "percent": 90 00:24:10.855 } 00:24:10.855 }, 00:24:10.855 "base_bdevs_list": [ 00:24:10.855 { 00:24:10.855 "name": "spare", 00:24:10.855 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:10.855 "is_configured": true, 00:24:10.855 "data_offset": 2048, 00:24:10.855 "data_size": 63488 00:24:10.855 }, 00:24:10.855 { 00:24:10.855 "name": "BaseBdev2", 00:24:10.855 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:10.855 "is_configured": true, 00:24:10.856 "data_offset": 2048, 00:24:10.856 "data_size": 63488 00:24:10.856 } 00:24:10.856 ] 00:24:10.856 }' 00:24:10.856 07:29:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.856 07:29:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.856 07:29:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.856 07:29:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.856 07:29:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:11.113 [2024-07-25 07:29:43.450553] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:11.113 [2024-07-25 07:29:43.450609] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:11.113 [2024-07-25 07:29:43.450690] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.046 07:29:44 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.046 "name": "raid_bdev1", 00:24:12.046 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:12.046 "strip_size_kb": 0, 00:24:12.046 "state": "online", 00:24:12.046 "raid_level": "raid1", 00:24:12.046 "superblock": true, 00:24:12.046 "num_base_bdevs": 2, 00:24:12.046 "num_base_bdevs_discovered": 2, 00:24:12.046 "num_base_bdevs_operational": 2, 00:24:12.046 "base_bdevs_list": [ 00:24:12.046 { 00:24:12.046 "name": "spare", 00:24:12.046 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:12.046 "is_configured": true, 00:24:12.046 "data_offset": 2048, 00:24:12.046 "data_size": 63488 00:24:12.046 }, 00:24:12.046 { 00:24:12.046 "name": "BaseBdev2", 00:24:12.046 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:12.046 "is_configured": true, 00:24:12.046 "data_offset": 2048, 00:24:12.046 "data_size": 63488 00:24:12.046 } 00:24:12.046 ] 00:24:12.046 }' 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:12.046 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.304 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.562 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.562 "name": "raid_bdev1", 00:24:12.562 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:12.562 "strip_size_kb": 0, 00:24:12.562 "state": "online", 00:24:12.562 "raid_level": "raid1", 00:24:12.563 "superblock": true, 00:24:12.563 "num_base_bdevs": 2, 00:24:12.563 "num_base_bdevs_discovered": 2, 00:24:12.563 "num_base_bdevs_operational": 2, 00:24:12.563 "base_bdevs_list": [ 00:24:12.563 { 00:24:12.563 "name": "spare", 00:24:12.563 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:12.563 "is_configured": true, 00:24:12.563 "data_offset": 2048, 00:24:12.563 "data_size": 63488 00:24:12.563 }, 00:24:12.563 { 00:24:12.563 "name": "BaseBdev2", 00:24:12.563 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:12.563 "is_configured": true, 
00:24:12.563 "data_offset": 2048, 00:24:12.563 "data_size": 63488 00:24:12.563 } 00:24:12.563 ] 00:24:12.563 }' 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.563 07:29:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.821 07:29:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.821 "name": "raid_bdev1", 00:24:12.821 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:12.821 "strip_size_kb": 0, 00:24:12.821 "state": "online", 00:24:12.821 "raid_level": "raid1", 00:24:12.821 "superblock": true, 00:24:12.821 "num_base_bdevs": 2, 00:24:12.821 "num_base_bdevs_discovered": 2, 00:24:12.821 "num_base_bdevs_operational": 2, 00:24:12.821 "base_bdevs_list": [ 00:24:12.821 { 00:24:12.821 "name": "spare", 00:24:12.821 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:12.821 "is_configured": true, 00:24:12.821 "data_offset": 2048, 00:24:12.821 "data_size": 63488 00:24:12.821 }, 00:24:12.821 { 00:24:12.821 "name": "BaseBdev2", 00:24:12.821 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:12.821 "is_configured": true, 00:24:12.821 "data_offset": 2048, 00:24:12.821 "data_size": 63488 00:24:12.821 } 00:24:12.821 ] 00:24:12.821 }' 00:24:12.821 07:29:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.821 07:29:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:13.387 07:29:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:13.644 [2024-07-25 07:29:45.937342] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:13.644 [2024-07-25 07:29:45.937370] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:24:13.644 [2024-07-25 07:29:45.937425] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:13.644 [2024-07-25 07:29:45.937482] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:13.644 [2024-07-25 07:29:45.937494] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19756c0 name raid_bdev1, state offline 00:24:13.644 07:29:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length 00:24:13.644 07:29:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:13.903 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:13.903 /dev/nbd0 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
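The nbd portion of this trace maps the two raid members onto kernel block devices so their contents can be checked with ordinary tools; the "1+0 records in/out" lines that follow are the output of the single 4 KiB O_DIRECT read that waitfornbd issues as a readiness probe. A condensed sketch of the same flow is below, assuming the repo path is shortened to $SPDK and /tmp/nbdtest stands in for the test's scratch file (both are placeholders, not paths from this run):

    # Export the bdevs through the kernel nbd driver (same RPCs as in the trace above).
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1
    # Readiness probe: one 4 KiB direct read from the freshly attached device.
    dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    # Compare the members past the first 1 MiB; that offset matches the data_offset of
    # 2048 blocks at the 512-byte blocklen reported for this raid bdev earlier in the log.
    cmp -i 1048576 /dev/nbd0 /dev/nbd1
    # Tear the nbd devices down again.
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1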
00:24:14.161 1+0 records in 00:24:14.161 1+0 records out 00:24:14.161 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259246 s, 15.8 MB/s 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:14.161 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:14.161 /dev/nbd1 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:14.420 1+0 records in 00:24:14.420 1+0 records out 00:24:14.420 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290078 s, 14.1 MB/s 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:14.420 07:29:46 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:14.420 07:29:46 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:14.678 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:14.679 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:24:14.937 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:15.195 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:15.453 [2024-07-25 07:29:47.750237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:15.453 [2024-07-25 07:29:47.750285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.453 [2024-07-25 
07:29:47.750306] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18cc390 00:24:15.453 [2024-07-25 07:29:47.750317] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.453 [2024-07-25 07:29:47.751836] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.453 [2024-07-25 07:29:47.751865] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:15.453 [2024-07-25 07:29:47.751943] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:15.453 [2024-07-25 07:29:47.751970] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:15.453 [2024-07-25 07:29:47.752061] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:15.453 spare 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.453 [2024-07-25 07:29:47.852369] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1977dd0 00:24:15.453 [2024-07-25 07:29:47.852386] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:15.453 [2024-07-25 07:29:47.852565] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1977190 00:24:15.453 [2024-07-25 07:29:47.852705] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1977dd0 00:24:15.453 [2024-07-25 07:29:47.852714] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1977dd0 00:24:15.453 [2024-07-25 07:29:47.852816] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:15.453 "name": "raid_bdev1", 00:24:15.453 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:15.453 "strip_size_kb": 0, 00:24:15.453 "state": "online", 00:24:15.453 "raid_level": "raid1", 00:24:15.453 "superblock": true, 00:24:15.453 "num_base_bdevs": 2, 00:24:15.453 "num_base_bdevs_discovered": 2, 00:24:15.453 "num_base_bdevs_operational": 2, 00:24:15.453 "base_bdevs_list": [ 00:24:15.453 { 
00:24:15.453 "name": "spare", 00:24:15.453 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:15.453 "is_configured": true, 00:24:15.453 "data_offset": 2048, 00:24:15.453 "data_size": 63488 00:24:15.453 }, 00:24:15.453 { 00:24:15.453 "name": "BaseBdev2", 00:24:15.453 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:15.453 "is_configured": true, 00:24:15.453 "data_offset": 2048, 00:24:15.453 "data_size": 63488 00:24:15.453 } 00:24:15.453 ] 00:24:15.453 }' 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:15.453 07:29:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:16.019 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:16.019 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.019 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:16.019 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:16.019 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.019 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.019 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.277 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.277 "name": "raid_bdev1", 00:24:16.277 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:16.277 "strip_size_kb": 0, 00:24:16.277 "state": "online", 00:24:16.277 "raid_level": "raid1", 00:24:16.277 "superblock": true, 00:24:16.277 "num_base_bdevs": 2, 00:24:16.277 "num_base_bdevs_discovered": 2, 00:24:16.277 "num_base_bdevs_operational": 2, 00:24:16.277 "base_bdevs_list": [ 00:24:16.277 { 00:24:16.277 "name": "spare", 00:24:16.277 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:16.277 "is_configured": true, 00:24:16.277 "data_offset": 2048, 00:24:16.277 "data_size": 63488 00:24:16.277 }, 00:24:16.277 { 00:24:16.277 "name": "BaseBdev2", 00:24:16.277 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:16.277 "is_configured": true, 00:24:16.277 "data_offset": 2048, 00:24:16.277 "data_size": 63488 00:24:16.277 } 00:24:16.277 ] 00:24:16.277 }' 00:24:16.277 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.277 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:16.277 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:16.536 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:16.536 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.536 07:29:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:16.536 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:24:16.536 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:16.794 [2024-07-25 07:29:49.262315] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:16.794 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:16.794 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:16.794 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:16.794 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:16.794 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:16.795 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:16.795 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.795 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.795 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.795 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.795 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.795 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.053 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.053 "name": "raid_bdev1", 00:24:17.053 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:17.053 "strip_size_kb": 0, 00:24:17.053 "state": "online", 00:24:17.053 "raid_level": "raid1", 00:24:17.053 "superblock": true, 00:24:17.053 "num_base_bdevs": 2, 00:24:17.053 "num_base_bdevs_discovered": 1, 00:24:17.053 "num_base_bdevs_operational": 1, 00:24:17.053 "base_bdevs_list": [ 00:24:17.053 { 00:24:17.053 "name": null, 00:24:17.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.053 "is_configured": false, 00:24:17.053 "data_offset": 2048, 00:24:17.053 "data_size": 63488 00:24:17.053 }, 00:24:17.053 { 00:24:17.053 "name": "BaseBdev2", 00:24:17.053 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:17.053 "is_configured": true, 00:24:17.053 "data_offset": 2048, 00:24:17.053 "data_size": 63488 00:24:17.053 } 00:24:17.053 ] 00:24:17.053 }' 00:24:17.053 07:29:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.053 07:29:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:17.619 07:29:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:17.877 [2024-07-25 07:29:50.317118] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:17.877 [2024-07-25 07:29:50.317268] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:17.877 [2024-07-25 07:29:50.317284] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
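At this point the test has removed the passthru bdev "spare" from raid_bdev1, verified that the array stays online with a single operational member, and re-added it; the examine path then notices the stale superblock sequence number (4 versus 5 in the messages above) and starts a fresh rebuild. A minimal sketch of that remove/re-add sequence, using the RPC socket from this run and with the repo path shortened to a $SPDK placeholder:

    # Drop the base bdev; raid_bdev1 remains online with one operational member.
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
    # Re-add it; the superblock examine sees the older sequence number and triggers a rebuild.
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
    # Poll rebuild progress the way verify_raid_bdev_process does in the script.
    $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'

Once the rebuild completes, the process object disappears from the RPC output and the jq filter falls back to "none", which is the condition the surrounding loop waits for.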
00:24:17.877 [2024-07-25 07:29:50.317312] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:17.877 [2024-07-25 07:29:50.321977] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1977640 00:24:17.877 [2024-07-25 07:29:50.323229] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:17.877 07:29:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.252 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:19.252 "name": "raid_bdev1", 00:24:19.253 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:19.253 "strip_size_kb": 0, 00:24:19.253 "state": "online", 00:24:19.253 "raid_level": "raid1", 00:24:19.253 "superblock": true, 00:24:19.253 "num_base_bdevs": 2, 00:24:19.253 "num_base_bdevs_discovered": 2, 00:24:19.253 "num_base_bdevs_operational": 2, 00:24:19.253 "process": { 00:24:19.253 "type": "rebuild", 00:24:19.253 "target": "spare", 00:24:19.253 "progress": { 00:24:19.253 "blocks": 24576, 00:24:19.253 "percent": 38 00:24:19.253 } 00:24:19.253 }, 00:24:19.253 "base_bdevs_list": [ 00:24:19.253 { 00:24:19.253 "name": "spare", 00:24:19.253 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:19.253 "is_configured": true, 00:24:19.253 "data_offset": 2048, 00:24:19.253 "data_size": 63488 00:24:19.253 }, 00:24:19.253 { 00:24:19.253 "name": "BaseBdev2", 00:24:19.253 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:19.253 "is_configured": true, 00:24:19.253 "data_offset": 2048, 00:24:19.253 "data_size": 63488 00:24:19.253 } 00:24:19.253 ] 00:24:19.253 }' 00:24:19.253 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:19.253 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:19.253 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:19.253 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:19.253 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:19.511 [2024-07-25 07:29:51.883017] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:19.511 [2024-07-25 07:29:51.935004] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:19.511 [2024-07-25 07:29:51.935048] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:24:19.511 [2024-07-25 07:29:51.935061] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:19.511 [2024-07-25 07:29:51.935069] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.511 07:29:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.768 07:29:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:19.768 "name": "raid_bdev1", 00:24:19.768 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:19.768 "strip_size_kb": 0, 00:24:19.768 "state": "online", 00:24:19.768 "raid_level": "raid1", 00:24:19.768 "superblock": true, 00:24:19.768 "num_base_bdevs": 2, 00:24:19.768 "num_base_bdevs_discovered": 1, 00:24:19.768 "num_base_bdevs_operational": 1, 00:24:19.768 "base_bdevs_list": [ 00:24:19.768 { 00:24:19.768 "name": null, 00:24:19.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:19.768 "is_configured": false, 00:24:19.768 "data_offset": 2048, 00:24:19.768 "data_size": 63488 00:24:19.768 }, 00:24:19.768 { 00:24:19.768 "name": "BaseBdev2", 00:24:19.768 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:19.768 "is_configured": true, 00:24:19.768 "data_offset": 2048, 00:24:19.768 "data_size": 63488 00:24:19.768 } 00:24:19.768 ] 00:24:19.768 }' 00:24:19.768 07:29:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:19.768 07:29:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:20.334 07:29:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:20.591 [2024-07-25 07:29:52.977929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:20.591 [2024-07-25 07:29:52.977978] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.591 [2024-07-25 07:29:52.977998] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18cd1f0 00:24:20.591 [2024-07-25 07:29:52.978009] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:24:20.591 [2024-07-25 07:29:52.978368] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.591 [2024-07-25 07:29:52.978386] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:20.591 [2024-07-25 07:29:52.978458] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:20.591 [2024-07-25 07:29:52.978470] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:20.591 [2024-07-25 07:29:52.978480] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:20.591 [2024-07-25 07:29:52.978497] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:20.591 [2024-07-25 07:29:52.983176] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a875b0 00:24:20.591 spare 00:24:20.591 [2024-07-25 07:29:52.984427] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:20.591 07:29:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:24:21.561 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:21.561 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:21.561 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:21.561 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:21.561 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:21.561 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.561 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.819 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:21.819 "name": "raid_bdev1", 00:24:21.819 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:21.819 "strip_size_kb": 0, 00:24:21.819 "state": "online", 00:24:21.819 "raid_level": "raid1", 00:24:21.819 "superblock": true, 00:24:21.819 "num_base_bdevs": 2, 00:24:21.819 "num_base_bdevs_discovered": 2, 00:24:21.819 "num_base_bdevs_operational": 2, 00:24:21.819 "process": { 00:24:21.819 "type": "rebuild", 00:24:21.819 "target": "spare", 00:24:21.819 "progress": { 00:24:21.819 "blocks": 24576, 00:24:21.819 "percent": 38 00:24:21.819 } 00:24:21.819 }, 00:24:21.819 "base_bdevs_list": [ 00:24:21.819 { 00:24:21.819 "name": "spare", 00:24:21.819 "uuid": "ac76becf-1e71-59c2-8928-cf40490d03ba", 00:24:21.819 "is_configured": true, 00:24:21.819 "data_offset": 2048, 00:24:21.819 "data_size": 63488 00:24:21.819 }, 00:24:21.819 { 00:24:21.819 "name": "BaseBdev2", 00:24:21.819 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:21.819 "is_configured": true, 00:24:21.819 "data_offset": 2048, 00:24:21.819 "data_size": 63488 00:24:21.819 } 00:24:21.819 ] 00:24:21.819 }' 00:24:21.819 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:21.819 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:21.819 07:29:54 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:21.819 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:21.819 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:22.077 [2024-07-25 07:29:54.537215] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.077 [2024-07-25 07:29:54.596129] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:22.077 [2024-07-25 07:29:54.596177] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:22.077 [2024-07-25 07:29:54.596190] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.077 [2024-07-25 07:29:54.596198] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.334 "name": "raid_bdev1", 00:24:22.334 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:22.334 "strip_size_kb": 0, 00:24:22.334 "state": "online", 00:24:22.334 "raid_level": "raid1", 00:24:22.334 "superblock": true, 00:24:22.334 "num_base_bdevs": 2, 00:24:22.334 "num_base_bdevs_discovered": 1, 00:24:22.334 "num_base_bdevs_operational": 1, 00:24:22.334 "base_bdevs_list": [ 00:24:22.334 { 00:24:22.334 "name": null, 00:24:22.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.334 "is_configured": false, 00:24:22.334 "data_offset": 2048, 00:24:22.334 "data_size": 63488 00:24:22.334 }, 00:24:22.334 { 00:24:22.334 "name": "BaseBdev2", 00:24:22.334 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:22.334 "is_configured": true, 00:24:22.334 "data_offset": 2048, 00:24:22.334 "data_size": 63488 00:24:22.334 } 00:24:22.334 ] 00:24:22.334 }' 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.334 07:29:54 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:24:22.900 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:22.900 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:22.900 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:22.900 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:22.900 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:23.158 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.158 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.158 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:23.158 "name": "raid_bdev1", 00:24:23.158 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:23.158 "strip_size_kb": 0, 00:24:23.158 "state": "online", 00:24:23.158 "raid_level": "raid1", 00:24:23.158 "superblock": true, 00:24:23.158 "num_base_bdevs": 2, 00:24:23.158 "num_base_bdevs_discovered": 1, 00:24:23.158 "num_base_bdevs_operational": 1, 00:24:23.158 "base_bdevs_list": [ 00:24:23.158 { 00:24:23.158 "name": null, 00:24:23.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.158 "is_configured": false, 00:24:23.158 "data_offset": 2048, 00:24:23.158 "data_size": 63488 00:24:23.158 }, 00:24:23.158 { 00:24:23.158 "name": "BaseBdev2", 00:24:23.158 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:23.158 "is_configured": true, 00:24:23.158 "data_offset": 2048, 00:24:23.158 "data_size": 63488 00:24:23.158 } 00:24:23.158 ] 00:24:23.158 }' 00:24:23.158 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:23.416 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:23.416 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:23.416 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:23.416 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:23.675 07:29:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:23.675 [2024-07-25 07:29:56.188630] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:23.675 [2024-07-25 07:29:56.188676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:23.675 [2024-07-25 07:29:56.188696] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d7850 00:24:23.675 [2024-07-25 07:29:56.188707] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:23.675 [2024-07-25 07:29:56.189027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:23.675 [2024-07-25 07:29:56.189044] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:23.675 [2024-07-25 07:29:56.189104] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:23.675 [2024-07-25 07:29:56.189122] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:23.675 [2024-07-25 07:29:56.189131] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:23.675 BaseBdev1 00:24:23.675 07:29:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.049 "name": "raid_bdev1", 00:24:25.049 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:25.049 "strip_size_kb": 0, 00:24:25.049 "state": "online", 00:24:25.049 "raid_level": "raid1", 00:24:25.049 "superblock": true, 00:24:25.049 "num_base_bdevs": 2, 00:24:25.049 "num_base_bdevs_discovered": 1, 00:24:25.049 "num_base_bdevs_operational": 1, 00:24:25.049 "base_bdevs_list": [ 00:24:25.049 { 00:24:25.049 "name": null, 00:24:25.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.049 "is_configured": false, 00:24:25.049 "data_offset": 2048, 00:24:25.049 "data_size": 63488 00:24:25.049 }, 00:24:25.049 { 00:24:25.049 "name": "BaseBdev2", 00:24:25.049 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:25.049 "is_configured": true, 00:24:25.049 "data_offset": 2048, 00:24:25.049 "data_size": 63488 00:24:25.049 } 00:24:25.049 ] 00:24:25.049 }' 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.049 07:29:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:25.616 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:25.616 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:25.616 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:25.616 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:24:25.616 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:25.616 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.616 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.874 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.874 "name": "raid_bdev1", 00:24:25.874 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:25.874 "strip_size_kb": 0, 00:24:25.874 "state": "online", 00:24:25.874 "raid_level": "raid1", 00:24:25.874 "superblock": true, 00:24:25.874 "num_base_bdevs": 2, 00:24:25.874 "num_base_bdevs_discovered": 1, 00:24:25.874 "num_base_bdevs_operational": 1, 00:24:25.874 "base_bdevs_list": [ 00:24:25.874 { 00:24:25.874 "name": null, 00:24:25.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.874 "is_configured": false, 00:24:25.874 "data_offset": 2048, 00:24:25.874 "data_size": 63488 00:24:25.874 }, 00:24:25.874 { 00:24:25.874 "name": "BaseBdev2", 00:24:25.874 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:25.874 "is_configured": true, 00:24:25.874 "data_offset": 2048, 00:24:25.874 "data_size": 63488 00:24:25.874 } 00:24:25.874 ] 00:24:25.875 }' 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:25.875 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:26.133 [2024-07-25 07:29:58.563080] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:26.133 [2024-07-25 07:29:58.563205] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:26.133 [2024-07-25 07:29:58.563220] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:26.133 request: 00:24:26.133 { 00:24:26.133 "base_bdev": "BaseBdev1", 00:24:26.133 "raid_bdev": "raid_bdev1", 00:24:26.133 "method": "bdev_raid_add_base_bdev", 00:24:26.133 "req_id": 1 00:24:26.133 } 00:24:26.133 Got JSON-RPC error response 00:24:26.133 response: 00:24:26.133 { 00:24:26.133 "code": -22, 00:24:26.133 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:26.133 } 00:24:26.133 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:24:26.133 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:24:26.133 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:24:26.133 07:29:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:24:26.133 07:29:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.067 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.324 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:27.324 "name": "raid_bdev1", 00:24:27.324 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:27.324 "strip_size_kb": 0, 00:24:27.324 "state": "online", 00:24:27.324 "raid_level": "raid1", 00:24:27.324 "superblock": true, 00:24:27.324 "num_base_bdevs": 2, 00:24:27.324 "num_base_bdevs_discovered": 1, 00:24:27.324 "num_base_bdevs_operational": 1, 00:24:27.324 
"base_bdevs_list": [ 00:24:27.324 { 00:24:27.324 "name": null, 00:24:27.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.324 "is_configured": false, 00:24:27.325 "data_offset": 2048, 00:24:27.325 "data_size": 63488 00:24:27.325 }, 00:24:27.325 { 00:24:27.325 "name": "BaseBdev2", 00:24:27.325 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:27.325 "is_configured": true, 00:24:27.325 "data_offset": 2048, 00:24:27.325 "data_size": 63488 00:24:27.325 } 00:24:27.325 ] 00:24:27.325 }' 00:24:27.325 07:29:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:27.325 07:29:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:27.890 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.890 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.890 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.890 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.890 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.890 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.890 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.147 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.147 "name": "raid_bdev1", 00:24:28.147 "uuid": "a5f3eea1-fd5b-48a3-9854-ca561a48e9bc", 00:24:28.147 "strip_size_kb": 0, 00:24:28.147 "state": "online", 00:24:28.147 "raid_level": "raid1", 00:24:28.147 "superblock": true, 00:24:28.147 "num_base_bdevs": 2, 00:24:28.147 "num_base_bdevs_discovered": 1, 00:24:28.147 "num_base_bdevs_operational": 1, 00:24:28.147 "base_bdevs_list": [ 00:24:28.147 { 00:24:28.147 "name": null, 00:24:28.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.147 "is_configured": false, 00:24:28.147 "data_offset": 2048, 00:24:28.147 "data_size": 63488 00:24:28.147 }, 00:24:28.147 { 00:24:28.147 "name": "BaseBdev2", 00:24:28.147 "uuid": "3e0ff27c-0d87-51d5-b809-1e3bd3179f2e", 00:24:28.147 "is_configured": true, 00:24:28.147 "data_offset": 2048, 00:24:28.147 "data_size": 63488 00:24:28.147 } 00:24:28.147 ] 00:24:28.147 }' 00:24:28.147 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.147 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:28.147 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 1718117 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1718117 ']' 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 1718117 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:28.405 07:30:00 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1718117 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1718117' 00:24:28.405 killing process with pid 1718117 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 1718117 00:24:28.405 Received shutdown signal, test time was about 60.000000 seconds 00:24:28.405 00:24:28.405 Latency(us) 00:24:28.405 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:28.405 =================================================================================================================== 00:24:28.405 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:28.405 [2024-07-25 07:30:00.775224] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:28.405 [2024-07-25 07:30:00.775307] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:28.405 [2024-07-25 07:30:00.775346] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:28.405 [2024-07-25 07:30:00.775360] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1977dd0 name raid_bdev1, state offline 00:24:28.405 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 1718117 00:24:28.405 [2024-07-25 07:30:00.800053] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:28.663 07:30:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:24:28.663 00:24:28.663 real 0m34.544s 00:24:28.663 user 0m50.093s 00:24:28.663 sys 0m6.382s 00:24:28.663 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:28.663 07:30:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:28.663 ************************************ 00:24:28.663 END TEST raid_rebuild_test_sb 00:24:28.663 ************************************ 00:24:28.663 07:30:01 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:24:28.663 07:30:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:28.663 07:30:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:28.663 07:30:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:28.663 ************************************ 00:24:28.663 START TEST raid_rebuild_test_io 00:24:28.663 ************************************ 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false true true 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( 
i = 1 )) 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:28.663 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1724454 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1724454 /var/tmp/spdk-raid.sock 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 1724454 ']' 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:28.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:28.664 07:30:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:28.664 [2024-07-25 07:30:01.141489] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
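For reference, the bdevperf target that produces the trace above can be brought up by hand with the same commands. The sketch below is assembled from the invocations visible in this log (workspace path, RPC socket, bdevperf flags, and the BaseBdev1 malloc/passthru RPCs are taken from this run); the rpc_get_methods polling loop stands in for the harness' waitforlisten helper and is an assumption, not part of the test script.

    #!/usr/bin/env bash
    # Sketch only: reproduce the bdevperf setup traced in this run.
    set -e
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-raid.sock
    RPC="$SPDK/scripts/rpc.py -s $SOCK"

    # 60 s of 50/50 random read/write at 3 MiB I/O, queue depth 2, against
    # raid_bdev1; -z defers the workload until a perform_tests RPC and
    # -L bdev_raid enables the raid debug log lines seen in this trace.
    "$SPDK/build/examples/bdevperf" -r "$SOCK" -T raid_bdev1 -t 60 -w randrw \
        -M 50 -o 3M -q 2 -U -z -L bdev_raid &

    # Wait for the RPC socket to answer before configuring bdevs.
    until $RPC rpc_get_methods &> /dev/null; do sleep 0.5; done

    # First base bdev, built exactly as in the trace that follows:
    # a 32 MiB, 512-byte-block malloc bdev wrapped in a passthru bdev.
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $RPC bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1

BaseBdev2 and the delay-backed spare are created the same way further down in the trace, followed by bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1.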
00:24:28.664 [2024-07-25 07:30:01.141545] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1724454 ] 00:24:28.664 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:28.664 Zero copy mechanism will not be used. 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.0 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.1 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.2 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.3 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.4 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.5 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.6 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:01.7 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.0 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.1 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.2 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.3 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.4 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.5 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.6 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3d:02.7 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3f:01.0 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3f:01.1 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3f:01.2 cannot be used 00:24:28.921 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.921 EAL: Requested device 0000:3f:01.3 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:01.4 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:01.5 cannot be used 00:24:28.922 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:01.6 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:01.7 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.0 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.1 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.2 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.3 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.4 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.5 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.6 cannot be used 00:24:28.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:28.922 EAL: Requested device 0000:3f:02.7 cannot be used 00:24:28.922 [2024-07-25 07:30:01.273185] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:28.922 [2024-07-25 07:30:01.359805] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:24:28.922 [2024-07-25 07:30:01.417839] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:28.922 [2024-07-25 07:30:01.417877] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:29.856 07:30:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:29.856 07:30:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:24:29.856 07:30:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:29.856 07:30:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:29.856 BaseBdev1_malloc 00:24:29.856 07:30:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:30.114 [2024-07-25 07:30:02.483419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:30.114 [2024-07-25 07:30:02.483462] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:30.114 [2024-07-25 07:30:02.483482] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bfd690 00:24:30.114 [2024-07-25 07:30:02.483494] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:30.114 [2024-07-25 07:30:02.484963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:30.114 [2024-07-25 07:30:02.484990] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:30.114 BaseBdev1 00:24:30.114 07:30:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:30.114 07:30:02 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:30.372 BaseBdev2_malloc 00:24:30.372 07:30:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:30.630 [2024-07-25 07:30:02.940925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:30.630 [2024-07-25 07:30:02.940970] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:30.630 [2024-07-25 07:30:02.940991] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bfe050 00:24:30.630 [2024-07-25 07:30:02.941002] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:30.630 [2024-07-25 07:30:02.942347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:30.630 [2024-07-25 07:30:02.942375] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:30.630 BaseBdev2 00:24:30.630 07:30:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:30.888 spare_malloc 00:24:30.888 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:30.888 spare_delay 00:24:30.888 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:31.145 [2024-07-25 07:30:03.623088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:31.145 [2024-07-25 07:30:03.623130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:31.145 [2024-07-25 07:30:03.623154] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c9e810 00:24:31.145 [2024-07-25 07:30:03.623166] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:31.145 [2024-07-25 07:30:03.624531] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:31.145 [2024-07-25 07:30:03.624559] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:31.145 spare 00:24:31.145 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:31.403 [2024-07-25 07:30:03.847696] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:31.403 [2024-07-25 07:30:03.848855] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:31.403 [2024-07-25 07:30:03.848935] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c9d6c0 00:24:31.403 [2024-07-25 07:30:03.848946] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:31.403 [2024-07-25 07:30:03.849129] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf5a70 00:24:31.403 [2024-07-25 
07:30:03.849281] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c9d6c0 00:24:31.403 [2024-07-25 07:30:03.849290] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c9d6c0 00:24:31.403 [2024-07-25 07:30:03.849394] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.403 07:30:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.661 07:30:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:31.661 "name": "raid_bdev1", 00:24:31.661 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:31.661 "strip_size_kb": 0, 00:24:31.661 "state": "online", 00:24:31.661 "raid_level": "raid1", 00:24:31.661 "superblock": false, 00:24:31.661 "num_base_bdevs": 2, 00:24:31.661 "num_base_bdevs_discovered": 2, 00:24:31.661 "num_base_bdevs_operational": 2, 00:24:31.661 "base_bdevs_list": [ 00:24:31.661 { 00:24:31.661 "name": "BaseBdev1", 00:24:31.661 "uuid": "fd5cc22e-92ba-5e06-8172-25dd13a12291", 00:24:31.661 "is_configured": true, 00:24:31.661 "data_offset": 0, 00:24:31.661 "data_size": 65536 00:24:31.661 }, 00:24:31.661 { 00:24:31.661 "name": "BaseBdev2", 00:24:31.661 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:31.661 "is_configured": true, 00:24:31.661 "data_offset": 0, 00:24:31.661 "data_size": 65536 00:24:31.661 } 00:24:31.661 ] 00:24:31.661 }' 00:24:31.661 07:30:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.661 07:30:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:32.227 07:30:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:32.227 07:30:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:24:32.485 [2024-07-25 07:30:04.850551] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:32.485 07:30:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:24:32.485 07:30:04 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.485 07:30:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:32.742 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:24:32.742 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:24:32.742 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:32.742 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:32.742 [2024-07-25 07:30:05.205285] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ca91f0 00:24:32.742 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:32.742 Zero copy mechanism will not be used. 00:24:32.742 Running I/O for 60 seconds... 00:24:33.000 [2024-07-25 07:30:05.314089] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:33.000 [2024-07-25 07:30:05.314269] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1ca91f0 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.000 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.258 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.258 "name": "raid_bdev1", 00:24:33.258 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:33.258 "strip_size_kb": 0, 00:24:33.258 "state": "online", 00:24:33.258 "raid_level": "raid1", 00:24:33.259 "superblock": false, 00:24:33.259 "num_base_bdevs": 2, 00:24:33.259 "num_base_bdevs_discovered": 1, 00:24:33.259 "num_base_bdevs_operational": 1, 00:24:33.259 "base_bdevs_list": [ 00:24:33.259 { 00:24:33.259 "name": null, 00:24:33.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.259 "is_configured": false, 00:24:33.259 "data_offset": 0, 00:24:33.259 "data_size": 65536 00:24:33.259 }, 
00:24:33.259 { 00:24:33.259 "name": "BaseBdev2", 00:24:33.259 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:33.259 "is_configured": true, 00:24:33.259 "data_offset": 0, 00:24:33.259 "data_size": 65536 00:24:33.259 } 00:24:33.259 ] 00:24:33.259 }' 00:24:33.259 07:30:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.259 07:30:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:33.824 07:30:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:34.082 [2024-07-25 07:30:06.388882] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:34.082 [2024-07-25 07:30:06.428083] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dae000 00:24:34.082 [2024-07-25 07:30:06.430251] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:34.082 07:30:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:34.082 [2024-07-25 07:30:06.547284] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:34.082 [2024-07-25 07:30:06.547553] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:34.339 [2024-07-25 07:30:06.680030] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:34.339 [2024-07-25 07:30:06.680154] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:34.597 [2024-07-25 07:30:06.922606] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:34.904 [2024-07-25 07:30:07.163017] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:34.904 [2024-07-25 07:30:07.163158] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:35.172 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:35.172 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:35.172 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:35.173 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:35.173 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:35.173 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.173 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.173 [2024-07-25 07:30:07.514034] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:35.173 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:35.173 "name": "raid_bdev1", 00:24:35.173 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:35.173 "strip_size_kb": 0, 00:24:35.173 
"state": "online", 00:24:35.173 "raid_level": "raid1", 00:24:35.173 "superblock": false, 00:24:35.173 "num_base_bdevs": 2, 00:24:35.173 "num_base_bdevs_discovered": 2, 00:24:35.173 "num_base_bdevs_operational": 2, 00:24:35.173 "process": { 00:24:35.173 "type": "rebuild", 00:24:35.173 "target": "spare", 00:24:35.173 "progress": { 00:24:35.173 "blocks": 14336, 00:24:35.173 "percent": 21 00:24:35.173 } 00:24:35.173 }, 00:24:35.173 "base_bdevs_list": [ 00:24:35.173 { 00:24:35.173 "name": "spare", 00:24:35.173 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 00:24:35.173 "is_configured": true, 00:24:35.173 "data_offset": 0, 00:24:35.173 "data_size": 65536 00:24:35.173 }, 00:24:35.173 { 00:24:35.173 "name": "BaseBdev2", 00:24:35.173 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:35.173 "is_configured": true, 00:24:35.173 "data_offset": 0, 00:24:35.173 "data_size": 65536 00:24:35.173 } 00:24:35.173 ] 00:24:35.173 }' 00:24:35.173 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:35.430 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:35.430 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:35.430 [2024-07-25 07:30:07.725785] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:35.430 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:35.430 07:30:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:35.689 [2024-07-25 07:30:07.973012] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.689 [2024-07-25 07:30:08.046271] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:35.689 [2024-07-25 07:30:08.046524] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:35.689 [2024-07-25 07:30:08.054537] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:35.689 [2024-07-25 07:30:08.056087] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:35.689 [2024-07-25 07:30:08.056112] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.689 [2024-07-25 07:30:08.056121] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:35.689 [2024-07-25 07:30:08.083981] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1ca91f0 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:35.689 07:30:08 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.689 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.947 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:35.947 "name": "raid_bdev1", 00:24:35.947 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:35.947 "strip_size_kb": 0, 00:24:35.947 "state": "online", 00:24:35.947 "raid_level": "raid1", 00:24:35.947 "superblock": false, 00:24:35.947 "num_base_bdevs": 2, 00:24:35.947 "num_base_bdevs_discovered": 1, 00:24:35.947 "num_base_bdevs_operational": 1, 00:24:35.947 "base_bdevs_list": [ 00:24:35.947 { 00:24:35.947 "name": null, 00:24:35.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.947 "is_configured": false, 00:24:35.947 "data_offset": 0, 00:24:35.947 "data_size": 65536 00:24:35.947 }, 00:24:35.947 { 00:24:35.947 "name": "BaseBdev2", 00:24:35.947 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:35.947 "is_configured": true, 00:24:35.947 "data_offset": 0, 00:24:35.947 "data_size": 65536 00:24:35.947 } 00:24:35.947 ] 00:24:35.947 }' 00:24:35.947 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:35.947 07:30:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:36.513 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:36.513 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.513 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:36.513 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:36.513 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.513 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.513 07:30:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.770 07:30:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:36.770 "name": "raid_bdev1", 00:24:36.770 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:36.770 "strip_size_kb": 0, 00:24:36.770 "state": "online", 00:24:36.770 "raid_level": "raid1", 00:24:36.770 "superblock": false, 00:24:36.770 "num_base_bdevs": 2, 00:24:36.770 "num_base_bdevs_discovered": 1, 00:24:36.770 "num_base_bdevs_operational": 1, 00:24:36.770 "base_bdevs_list": [ 00:24:36.770 { 00:24:36.770 "name": null, 00:24:36.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.770 "is_configured": false, 00:24:36.770 "data_offset": 0, 00:24:36.770 "data_size": 65536 00:24:36.770 }, 00:24:36.770 { 00:24:36.770 "name": "BaseBdev2", 00:24:36.770 "uuid": 
"abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:36.770 "is_configured": true, 00:24:36.770 "data_offset": 0, 00:24:36.770 "data_size": 65536 00:24:36.770 } 00:24:36.770 ] 00:24:36.770 }' 00:24:36.770 07:30:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:36.770 07:30:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:36.770 07:30:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:36.770 07:30:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:36.770 07:30:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:37.028 [2024-07-25 07:30:09.493841] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:37.028 [2024-07-25 07:30:09.533174] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf5a90 00:24:37.028 [2024-07-25 07:30:09.534564] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:37.028 07:30:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:37.286 [2024-07-25 07:30:09.643794] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:37.286 [2024-07-25 07:30:09.644194] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:37.286 [2024-07-25 07:30:09.769603] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:37.286 [2024-07-25 07:30:09.769726] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:38.218 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.218 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.218 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.218 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.218 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.218 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.218 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.218 [2024-07-25 07:30:10.566445] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:38.218 [2024-07-25 07:30:10.566617] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.476 "name": "raid_bdev1", 00:24:38.476 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:38.476 "strip_size_kb": 0, 00:24:38.476 "state": "online", 00:24:38.476 "raid_level": "raid1", 00:24:38.476 "superblock": false, 00:24:38.476 "num_base_bdevs": 2, 00:24:38.476 
"num_base_bdevs_discovered": 2, 00:24:38.476 "num_base_bdevs_operational": 2, 00:24:38.476 "process": { 00:24:38.476 "type": "rebuild", 00:24:38.476 "target": "spare", 00:24:38.476 "progress": { 00:24:38.476 "blocks": 16384, 00:24:38.476 "percent": 25 00:24:38.476 } 00:24:38.476 }, 00:24:38.476 "base_bdevs_list": [ 00:24:38.476 { 00:24:38.476 "name": "spare", 00:24:38.476 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 00:24:38.476 "is_configured": true, 00:24:38.476 "data_offset": 0, 00:24:38.476 "data_size": 65536 00:24:38.476 }, 00:24:38.476 { 00:24:38.476 "name": "BaseBdev2", 00:24:38.476 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:38.476 "is_configured": true, 00:24:38.476 "data_offset": 0, 00:24:38.476 "data_size": 65536 00:24:38.476 } 00:24:38.476 ] 00:24:38.476 }' 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=788 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.476 07:30:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.734 [2024-07-25 07:30:11.051561] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:38.734 [2024-07-25 07:30:11.051775] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:38.734 07:30:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.734 "name": "raid_bdev1", 00:24:38.734 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:38.734 "strip_size_kb": 0, 00:24:38.734 "state": "online", 00:24:38.734 "raid_level": "raid1", 00:24:38.734 "superblock": false, 00:24:38.734 "num_base_bdevs": 2, 00:24:38.734 "num_base_bdevs_discovered": 2, 00:24:38.734 "num_base_bdevs_operational": 
2, 00:24:38.734 "process": { 00:24:38.734 "type": "rebuild", 00:24:38.734 "target": "spare", 00:24:38.734 "progress": { 00:24:38.734 "blocks": 22528, 00:24:38.734 "percent": 34 00:24:38.734 } 00:24:38.734 }, 00:24:38.734 "base_bdevs_list": [ 00:24:38.734 { 00:24:38.734 "name": "spare", 00:24:38.734 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 00:24:38.734 "is_configured": true, 00:24:38.734 "data_offset": 0, 00:24:38.734 "data_size": 65536 00:24:38.734 }, 00:24:38.734 { 00:24:38.734 "name": "BaseBdev2", 00:24:38.734 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:38.734 "is_configured": true, 00:24:38.734 "data_offset": 0, 00:24:38.734 "data_size": 65536 00:24:38.734 } 00:24:38.734 ] 00:24:38.734 }' 00:24:38.734 07:30:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.734 07:30:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.734 07:30:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.734 07:30:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.734 07:30:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:38.992 [2024-07-25 07:30:11.388914] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:39.249 [2024-07-25 07:30:11.607692] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.814 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.814 [2024-07-25 07:30:12.198253] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:40.072 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:40.072 "name": "raid_bdev1", 00:24:40.072 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:40.072 "strip_size_kb": 0, 00:24:40.072 "state": "online", 00:24:40.072 "raid_level": "raid1", 00:24:40.072 "superblock": false, 00:24:40.072 "num_base_bdevs": 2, 00:24:40.072 "num_base_bdevs_discovered": 2, 00:24:40.072 "num_base_bdevs_operational": 2, 00:24:40.072 "process": { 00:24:40.072 "type": "rebuild", 00:24:40.072 "target": "spare", 00:24:40.072 "progress": { 00:24:40.072 "blocks": 38912, 00:24:40.072 "percent": 59 00:24:40.072 } 00:24:40.072 }, 00:24:40.072 "base_bdevs_list": [ 00:24:40.072 { 00:24:40.072 "name": "spare", 00:24:40.072 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 
00:24:40.072 "is_configured": true, 00:24:40.072 "data_offset": 0, 00:24:40.072 "data_size": 65536 00:24:40.072 }, 00:24:40.072 { 00:24:40.072 "name": "BaseBdev2", 00:24:40.072 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:40.072 "is_configured": true, 00:24:40.072 "data_offset": 0, 00:24:40.072 "data_size": 65536 00:24:40.072 } 00:24:40.072 ] 00:24:40.072 }' 00:24:40.072 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:40.072 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:40.072 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:40.072 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:40.072 07:30:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:40.638 [2024-07-25 07:30:12.994369] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:24:40.638 [2024-07-25 07:30:13.118388] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.203 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.203 [2024-07-25 07:30:13.584701] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:24:41.461 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.461 "name": "raid_bdev1", 00:24:41.461 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:41.461 "strip_size_kb": 0, 00:24:41.461 "state": "online", 00:24:41.461 "raid_level": "raid1", 00:24:41.461 "superblock": false, 00:24:41.461 "num_base_bdevs": 2, 00:24:41.461 "num_base_bdevs_discovered": 2, 00:24:41.461 "num_base_bdevs_operational": 2, 00:24:41.461 "process": { 00:24:41.461 "type": "rebuild", 00:24:41.461 "target": "spare", 00:24:41.461 "progress": { 00:24:41.461 "blocks": 59392, 00:24:41.461 "percent": 90 00:24:41.461 } 00:24:41.461 }, 00:24:41.461 "base_bdevs_list": [ 00:24:41.461 { 00:24:41.461 "name": "spare", 00:24:41.461 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 00:24:41.461 "is_configured": true, 00:24:41.461 "data_offset": 0, 00:24:41.461 "data_size": 65536 00:24:41.461 }, 00:24:41.461 { 00:24:41.461 "name": "BaseBdev2", 00:24:41.461 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:41.461 "is_configured": true, 00:24:41.461 "data_offset": 0, 00:24:41.461 "data_size": 65536 00:24:41.461 } 
00:24:41.461 ] 00:24:41.461 }' 00:24:41.461 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.461 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:41.461 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.461 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:41.461 07:30:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:41.719 [2024-07-25 07:30:14.030582] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:41.719 [2024-07-25 07:30:14.130889] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:41.719 [2024-07-25 07:30:14.131938] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.654 07:30:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:42.654 "name": "raid_bdev1", 00:24:42.654 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:42.654 "strip_size_kb": 0, 00:24:42.654 "state": "online", 00:24:42.654 "raid_level": "raid1", 00:24:42.654 "superblock": false, 00:24:42.654 "num_base_bdevs": 2, 00:24:42.654 "num_base_bdevs_discovered": 2, 00:24:42.654 "num_base_bdevs_operational": 2, 00:24:42.654 "base_bdevs_list": [ 00:24:42.654 { 00:24:42.654 "name": "spare", 00:24:42.654 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 00:24:42.654 "is_configured": true, 00:24:42.654 "data_offset": 0, 00:24:42.654 "data_size": 65536 00:24:42.654 }, 00:24:42.654 { 00:24:42.654 "name": "BaseBdev2", 00:24:42.654 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:42.654 "is_configured": true, 00:24:42.654 "data_offset": 0, 00:24:42.654 "data_size": 65536 00:24:42.654 } 00:24:42.654 ] 00:24:42.654 }' 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.654 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.912 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:42.912 "name": "raid_bdev1", 00:24:42.912 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:42.912 "strip_size_kb": 0, 00:24:42.912 "state": "online", 00:24:42.912 "raid_level": "raid1", 00:24:42.912 "superblock": false, 00:24:42.912 "num_base_bdevs": 2, 00:24:42.912 "num_base_bdevs_discovered": 2, 00:24:42.912 "num_base_bdevs_operational": 2, 00:24:42.912 "base_bdevs_list": [ 00:24:42.912 { 00:24:42.912 "name": "spare", 00:24:42.912 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 00:24:42.913 "is_configured": true, 00:24:42.913 "data_offset": 0, 00:24:42.913 "data_size": 65536 00:24:42.913 }, 00:24:42.913 { 00:24:42.913 "name": "BaseBdev2", 00:24:42.913 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:42.913 "is_configured": true, 00:24:42.913 "data_offset": 0, 00:24:42.913 "data_size": 65536 00:24:42.913 } 00:24:42.913 ] 00:24:42.913 }' 00:24:42.913 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:42.913 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:42.913 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.171 07:30:15 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:43.171 "name": "raid_bdev1", 00:24:43.171 "uuid": "b3f29887-f991-4391-a111-7f0dd34d9686", 00:24:43.171 "strip_size_kb": 0, 00:24:43.171 "state": "online", 00:24:43.171 "raid_level": "raid1", 00:24:43.171 "superblock": false, 00:24:43.171 "num_base_bdevs": 2, 00:24:43.171 "num_base_bdevs_discovered": 2, 00:24:43.171 "num_base_bdevs_operational": 2, 00:24:43.171 "base_bdevs_list": [ 00:24:43.171 { 00:24:43.171 "name": "spare", 00:24:43.171 "uuid": "b46afa39-4803-5488-b0ae-3821e0658be1", 00:24:43.171 "is_configured": true, 00:24:43.171 "data_offset": 0, 00:24:43.171 "data_size": 65536 00:24:43.171 }, 00:24:43.171 { 00:24:43.171 "name": "BaseBdev2", 00:24:43.171 "uuid": "abfe58e3-1507-5e1f-b793-e2d63ea8fe39", 00:24:43.171 "is_configured": true, 00:24:43.171 "data_offset": 0, 00:24:43.171 "data_size": 65536 00:24:43.171 } 00:24:43.171 ] 00:24:43.171 }' 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:43.171 07:30:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:43.737 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:43.995 [2024-07-25 07:30:16.479900] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:43.995 [2024-07-25 07:30:16.479930] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:44.254 00:24:44.254 Latency(us) 00:24:44.254 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:44.254 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:44.254 raid_bdev1 : 11.30 98.89 296.68 0.00 0.00 14043.38 270.34 117440.51 00:24:44.254 =================================================================================================================== 00:24:44.254 Total : 98.89 296.68 0.00 0.00 14043.38 270.34 117440.51 00:24:44.254 [2024-07-25 07:30:16.543860] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.254 [2024-07-25 07:30:16.543885] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:44.254 [2024-07-25 07:30:16.543951] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:44.254 [2024-07-25 07:30:16.543962] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c9d6c0 name raid_bdev1, state offline 00:24:44.254 0 00:24:44.254 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.254 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # jq length 00:24:44.254 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:24:44.254 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:24:44.254 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:24:44.254 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:44.513 07:30:16 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:44.513 07:30:16 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:44.513 /dev/nbd0 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:44.513 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:44.771 1+0 records in 00:24:44.771 1+0 records out 00:24:44.772 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289607 s, 14.1 MB/s 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:44.772 /dev/nbd1 00:24:44.772 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:45.031 1+0 records in 00:24:45.031 1+0 records out 00:24:45.031 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300703 s, 13.6 MB/s 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 
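[editorial sketch] The nbd_common.sh trace above is the data-verification step of raid_rebuild_test: after the rebuild, the rebuilt target ("spare") and the surviving base bdev ("BaseBdev2") are each exported as kernel NBD block devices, probed with a single direct-I/O read, byte-compared with cmp, and detached again. A minimal standalone sketch of that flow, assuming the bdevperf RPC socket at /var/tmp/spdk-raid.sock is already up and both bdevs exist; $rpc abbreviates the full scripts/rpc.py invocation used throughout this log, and the readiness loop is a simplified stand-in for the waitfornbd helper traced above.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Export both bdevs over NBD (same RPCs as nbd_start_disks above).
$rpc nbd_start_disk spare /dev/nbd0
$rpc nbd_start_disk BaseBdev2 /dev/nbd1

# Wait until the kernel has registered each device (simplified waitfornbd).
for n in nbd0 nbd1; do
    until grep -q -w "$n" /proc/partitions; do sleep 0.1; done
done

# Byte-for-byte comparison starting at offset 0; a non-zero exit fails the test.
cmp -i 0 /dev/nbd0 /dev/nbd1

# Detach the NBD devices again (nbd_stop_disks).
$rpc nbd_stop_disk /dev/nbd1
$rpc nbd_stop_disk /dev/nbd0

The offset passed to cmp -i is 0 here because this run was created without a superblock; the superblock variant further down reports a data_offset of 2048 blocks instead.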
00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:45.031 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:45.289 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1724454 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 1724454 ']' 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@954 -- # kill -0 1724454 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1724454 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1724454' 00:24:45.548 killing process with pid 1724454 00:24:45.548 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 1724454 00:24:45.548 Received shutdown signal, test time was about 12.737372 seconds 00:24:45.548 00:24:45.548 Latency(us) 00:24:45.548 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:45.548 =================================================================================================================== 00:24:45.548 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:45.549 [2024-07-25 07:30:17.975882] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:45.549 07:30:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1724454 00:24:45.549 [2024-07-25 07:30:17.995213] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:24:45.807 00:24:45.807 real 0m17.115s 00:24:45.807 user 0m25.979s 00:24:45.807 sys 0m2.652s 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:45.807 ************************************ 00:24:45.807 END TEST raid_rebuild_test_io 00:24:45.807 ************************************ 00:24:45.807 07:30:18 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:24:45.807 07:30:18 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:45.807 07:30:18 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:45.807 07:30:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:45.807 ************************************ 00:24:45.807 START TEST raid_rebuild_test_sb_io 00:24:45.807 ************************************ 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true true true 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 
00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1727976 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1727976 /var/tmp/spdk-raid.sock 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1727976 ']' 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:45.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:45.807 07:30:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:46.067 [2024-07-25 07:30:18.349307] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
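[editorial sketch] The trace above launches the bdevperf application that hosts every subsequent RPC in raid_rebuild_test_sb_io: it waits for a raid bdev named raid_bdev1 (-T), runs a 50/50 random read/write workload with 3M I/Os at queue depth 2 for up to 60 s (-w/-M/-o/-q/-t), starts idle until told to begin (-z), and enables bdev_raid debug logging (-L). A hedged sketch of starting it and waiting for its RPC socket, using the same options as in the log; SPDK_DIR stands for the checkout path shown in the trace, and the readiness loop is my simplified substitute for the waitforlisten helper, not its actual implementation.

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # checkout path from this run
sock=/var/tmp/spdk-raid.sock

"$SPDK_DIR/build/examples/bdevperf" \
    -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!

# Simplified stand-in for waitforlisten: poll until the RPC socket answers.
until "$SPDK_DIR/scripts/rpc.py" -s "$sock" rpc_get_methods >/dev/null 2>&1; do
    sleep 0.2
done

Because -z keeps bdevperf idle, the background I/O is only kicked off later with examples/bdev/bdevperf/bdevperf.py -s $sock perform_tests, as seen further down in this trace, once the raid1 bdev with superblock has been assembled from the two passthru base bdevs.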
00:24:46.067 [2024-07-25 07:30:18.349365] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1727976 ] 00:24:46.067 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:46.067 Zero copy mechanism will not be used. 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.0 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.1 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.2 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.3 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.4 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.5 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.6 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:01.7 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.0 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.1 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.2 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.3 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.4 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.5 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.6 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3d:02.7 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.067 EAL: Requested device 0000:3f:01.0 cannot be used 00:24:46.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:01.1 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:01.2 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:01.3 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:01.4 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:01.5 cannot be used 00:24:46.068 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:01.6 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:01.7 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.0 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.1 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.2 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.3 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.4 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.5 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.6 cannot be used 00:24:46.068 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:46.068 EAL: Requested device 0000:3f:02.7 cannot be used 00:24:46.068 [2024-07-25 07:30:18.480639] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:46.068 [2024-07-25 07:30:18.563236] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:24:46.326 [2024-07-25 07:30:18.624607] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:46.326 [2024-07-25 07:30:18.624646] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:46.893 07:30:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:46.893 07:30:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:24:46.893 07:30:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:46.893 07:30:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:47.152 BaseBdev1_malloc 00:24:47.152 07:30:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:47.410 [2024-07-25 07:30:19.688757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:47.410 [2024-07-25 07:30:19.688803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.410 [2024-07-25 07:30:19.688823] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23cf690 00:24:47.410 [2024-07-25 07:30:19.688835] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.410 [2024-07-25 07:30:19.690290] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.410 [2024-07-25 07:30:19.690318] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:47.410 BaseBdev1 00:24:47.410 07:30:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:47.410 07:30:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:47.410 BaseBdev2_malloc 00:24:47.410 07:30:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:47.668 [2024-07-25 07:30:20.098114] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:47.668 [2024-07-25 07:30:20.098165] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.668 [2024-07-25 07:30:20.098192] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d0050 00:24:47.668 [2024-07-25 07:30:20.098204] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.668 [2024-07-25 07:30:20.099514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.668 [2024-07-25 07:30:20.099541] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:47.668 BaseBdev2 00:24:47.668 07:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:47.939 spare_malloc 00:24:47.939 07:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:48.234 spare_delay 00:24:48.234 07:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:48.505 [2024-07-25 07:30:20.783956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:48.505 [2024-07-25 07:30:20.783992] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:48.505 [2024-07-25 07:30:20.784008] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2470810 00:24:48.505 [2024-07-25 07:30:20.784020] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:48.505 [2024-07-25 07:30:20.785282] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:48.505 [2024-07-25 07:30:20.785308] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:48.505 spare 00:24:48.505 07:30:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:48.505 [2024-07-25 07:30:21.008570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:48.505 [2024-07-25 07:30:21.009615] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:48.505 [2024-07-25 07:30:21.009766] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x246f6c0 00:24:48.505 [2024-07-25 07:30:21.009779] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:48.505 [2024-07-25 07:30:21.009937] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x23d1780 00:24:48.505 [2024-07-25 07:30:21.010063] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x246f6c0 00:24:48.505 [2024-07-25 07:30:21.010073] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x246f6c0 00:24:48.505 [2024-07-25 07:30:21.010163] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.505 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.764 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:48.764 "name": "raid_bdev1", 00:24:48.764 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:48.764 "strip_size_kb": 0, 00:24:48.764 "state": "online", 00:24:48.764 "raid_level": "raid1", 00:24:48.764 "superblock": true, 00:24:48.764 "num_base_bdevs": 2, 00:24:48.764 "num_base_bdevs_discovered": 2, 00:24:48.764 "num_base_bdevs_operational": 2, 00:24:48.764 "base_bdevs_list": [ 00:24:48.764 { 00:24:48.764 "name": "BaseBdev1", 00:24:48.764 "uuid": "cdd3e9cc-7852-5feb-aed0-c59d6f346932", 00:24:48.764 "is_configured": true, 00:24:48.764 "data_offset": 2048, 00:24:48.764 "data_size": 63488 00:24:48.764 }, 00:24:48.764 { 00:24:48.764 "name": "BaseBdev2", 00:24:48.764 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:48.764 "is_configured": true, 00:24:48.764 "data_offset": 2048, 00:24:48.764 "data_size": 63488 00:24:48.764 } 00:24:48.764 ] 00:24:48.764 }' 00:24:48.764 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:48.764 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:49.331 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:49.331 07:30:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:24:49.590 [2024-07-25 07:30:22.055541] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:49.590 07:30:22 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:24:49.590 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:49.590 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.849 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:24:49.849 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:24:49.849 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:49.849 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:50.108 [2024-07-25 07:30:22.414241] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x247b940 00:24:50.108 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:50.108 Zero copy mechanism will not be used. 00:24:50.108 Running I/O for 60 seconds... 00:24:50.108 [2024-07-25 07:30:22.520533] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:50.108 [2024-07-25 07:30:22.528047] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x247b940 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.108 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.367 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.367 "name": "raid_bdev1", 00:24:50.367 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:50.367 "strip_size_kb": 0, 00:24:50.367 "state": "online", 00:24:50.367 "raid_level": "raid1", 00:24:50.367 "superblock": true, 00:24:50.367 "num_base_bdevs": 2, 00:24:50.367 "num_base_bdevs_discovered": 1, 00:24:50.367 "num_base_bdevs_operational": 1, 00:24:50.367 "base_bdevs_list": [ 00:24:50.367 { 00:24:50.367 "name": null, 00:24:50.367 
"uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.367 "is_configured": false, 00:24:50.367 "data_offset": 2048, 00:24:50.367 "data_size": 63488 00:24:50.367 }, 00:24:50.367 { 00:24:50.367 "name": "BaseBdev2", 00:24:50.367 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:50.367 "is_configured": true, 00:24:50.367 "data_offset": 2048, 00:24:50.367 "data_size": 63488 00:24:50.367 } 00:24:50.367 ] 00:24:50.367 }' 00:24:50.367 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.367 07:30:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:50.934 07:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:51.194 [2024-07-25 07:30:23.539887] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:51.194 [2024-07-25 07:30:23.593334] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20cdba0 00:24:51.194 [2024-07-25 07:30:23.595488] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:51.194 07:30:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:51.194 [2024-07-25 07:30:23.712518] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:51.194 [2024-07-25 07:30:23.712898] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:51.452 [2024-07-25 07:30:23.930746] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:51.452 [2024-07-25 07:30:23.930891] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:52.019 [2024-07-25 07:30:24.273173] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:52.019 [2024-07-25 07:30:24.491571] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:52.278 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:52.278 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:52.278 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:52.278 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:52.278 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:52.278 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.278 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.544 [2024-07-25 07:30:24.828329] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:52.544 [2024-07-25 07:30:24.828722] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:52.544 07:30:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:52.544 "name": "raid_bdev1", 00:24:52.544 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:52.544 "strip_size_kb": 0, 00:24:52.544 "state": "online", 00:24:52.544 "raid_level": "raid1", 00:24:52.544 "superblock": true, 00:24:52.544 "num_base_bdevs": 2, 00:24:52.544 "num_base_bdevs_discovered": 2, 00:24:52.544 "num_base_bdevs_operational": 2, 00:24:52.544 "process": { 00:24:52.544 "type": "rebuild", 00:24:52.544 "target": "spare", 00:24:52.544 "progress": { 00:24:52.544 "blocks": 12288, 00:24:52.544 "percent": 19 00:24:52.544 } 00:24:52.544 }, 00:24:52.544 "base_bdevs_list": [ 00:24:52.544 { 00:24:52.544 "name": "spare", 00:24:52.544 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:24:52.544 "is_configured": true, 00:24:52.544 "data_offset": 2048, 00:24:52.544 "data_size": 63488 00:24:52.545 }, 00:24:52.545 { 00:24:52.545 "name": "BaseBdev2", 00:24:52.545 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:52.545 "is_configured": true, 00:24:52.545 "data_offset": 2048, 00:24:52.545 "data_size": 63488 00:24:52.545 } 00:24:52.545 ] 00:24:52.545 }' 00:24:52.545 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:52.545 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:52.545 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:52.545 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:52.545 07:30:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:52.545 [2024-07-25 07:30:25.039050] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:52.806 [2024-07-25 07:30:25.128068] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:52.806 [2024-07-25 07:30:25.215979] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:52.806 [2024-07-25 07:30:25.217438] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:52.806 [2024-07-25 07:30:25.217463] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:52.806 [2024-07-25 07:30:25.217472] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:52.806 [2024-07-25 07:30:25.237748] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x247b940 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.806 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.064 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:53.064 "name": "raid_bdev1", 00:24:53.064 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:53.064 "strip_size_kb": 0, 00:24:53.064 "state": "online", 00:24:53.064 "raid_level": "raid1", 00:24:53.064 "superblock": true, 00:24:53.064 "num_base_bdevs": 2, 00:24:53.064 "num_base_bdevs_discovered": 1, 00:24:53.064 "num_base_bdevs_operational": 1, 00:24:53.064 "base_bdevs_list": [ 00:24:53.064 { 00:24:53.064 "name": null, 00:24:53.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.064 "is_configured": false, 00:24:53.064 "data_offset": 2048, 00:24:53.064 "data_size": 63488 00:24:53.064 }, 00:24:53.064 { 00:24:53.064 "name": "BaseBdev2", 00:24:53.064 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:53.064 "is_configured": true, 00:24:53.064 "data_offset": 2048, 00:24:53.064 "data_size": 63488 00:24:53.064 } 00:24:53.064 ] 00:24:53.064 }' 00:24:53.064 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:53.064 07:30:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:53.630 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:53.630 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:53.630 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:53.630 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:53.630 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:53.630 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.630 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.888 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:53.888 "name": "raid_bdev1", 00:24:53.888 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:53.888 "strip_size_kb": 0, 00:24:53.888 "state": "online", 00:24:53.888 "raid_level": "raid1", 00:24:53.888 "superblock": true, 00:24:53.888 "num_base_bdevs": 2, 00:24:53.888 "num_base_bdevs_discovered": 1, 00:24:53.888 "num_base_bdevs_operational": 1, 00:24:53.888 "base_bdevs_list": [ 00:24:53.888 { 00:24:53.888 "name": null, 00:24:53.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.888 "is_configured": false, 00:24:53.888 "data_offset": 2048, 00:24:53.888 "data_size": 63488 00:24:53.888 }, 00:24:53.888 { 00:24:53.888 "name": "BaseBdev2", 00:24:53.888 "uuid": 
"f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:53.888 "is_configured": true, 00:24:53.889 "data_offset": 2048, 00:24:53.889 "data_size": 63488 00:24:53.889 } 00:24:53.889 ] 00:24:53.889 }' 00:24:53.889 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:53.889 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:53.889 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:53.889 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:53.889 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:54.147 [2024-07-25 07:30:26.602770] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:54.147 07:30:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:54.147 [2024-07-25 07:30:26.656757] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23d24e0 00:24:54.147 [2024-07-25 07:30:26.658132] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:54.406 [2024-07-25 07:30:26.774886] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:54.406 [2024-07-25 07:30:26.775161] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:54.406 [2024-07-25 07:30:26.893388] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:54.406 [2024-07-25 07:30:26.893519] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:54.664 [2024-07-25 07:30:27.135330] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:54.922 [2024-07-25 07:30:27.352424] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:54.922 [2024-07-25 07:30:27.352632] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:55.180 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:55.180 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.180 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:55.180 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:55.180 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.180 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.180 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.180 [2024-07-25 07:30:27.672273] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:55.180 [2024-07-25 
07:30:27.672651] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:55.439 [2024-07-25 07:30:27.890761] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:55.439 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.439 "name": "raid_bdev1", 00:24:55.439 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:55.439 "strip_size_kb": 0, 00:24:55.439 "state": "online", 00:24:55.439 "raid_level": "raid1", 00:24:55.439 "superblock": true, 00:24:55.439 "num_base_bdevs": 2, 00:24:55.439 "num_base_bdevs_discovered": 2, 00:24:55.439 "num_base_bdevs_operational": 2, 00:24:55.439 "process": { 00:24:55.439 "type": "rebuild", 00:24:55.439 "target": "spare", 00:24:55.439 "progress": { 00:24:55.439 "blocks": 14336, 00:24:55.439 "percent": 22 00:24:55.439 } 00:24:55.439 }, 00:24:55.439 "base_bdevs_list": [ 00:24:55.439 { 00:24:55.439 "name": "spare", 00:24:55.439 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:24:55.439 "is_configured": true, 00:24:55.439 "data_offset": 2048, 00:24:55.439 "data_size": 63488 00:24:55.439 }, 00:24:55.439 { 00:24:55.439 "name": "BaseBdev2", 00:24:55.439 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:55.439 "is_configured": true, 00:24:55.439 "data_offset": 2048, 00:24:55.439 "data_size": 63488 00:24:55.439 } 00:24:55.439 ] 00:24:55.439 }' 00:24:55.439 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.439 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:55.439 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:55.698 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:24:55.699 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=805 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.699 07:30:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.699 [2024-07-25 07:30:28.132316] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:55.699 [2024-07-25 07:30:28.132623] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:55.699 07:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.699 "name": "raid_bdev1", 00:24:55.699 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:55.699 "strip_size_kb": 0, 00:24:55.699 "state": "online", 00:24:55.699 "raid_level": "raid1", 00:24:55.699 "superblock": true, 00:24:55.699 "num_base_bdevs": 2, 00:24:55.699 "num_base_bdevs_discovered": 2, 00:24:55.699 "num_base_bdevs_operational": 2, 00:24:55.699 "process": { 00:24:55.699 "type": "rebuild", 00:24:55.699 "target": "spare", 00:24:55.699 "progress": { 00:24:55.699 "blocks": 20480, 00:24:55.699 "percent": 32 00:24:55.699 } 00:24:55.699 }, 00:24:55.699 "base_bdevs_list": [ 00:24:55.699 { 00:24:55.699 "name": "spare", 00:24:55.699 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:24:55.699 "is_configured": true, 00:24:55.699 "data_offset": 2048, 00:24:55.699 "data_size": 63488 00:24:55.699 }, 00:24:55.699 { 00:24:55.699 "name": "BaseBdev2", 00:24:55.699 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:55.699 "is_configured": true, 00:24:55.699 "data_offset": 2048, 00:24:55.699 "data_size": 63488 00:24:55.699 } 00:24:55.699 ] 00:24:55.699 }' 00:24:55.699 07:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.957 07:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:55.957 07:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:55.957 07:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:55.957 07:30:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:55.957 [2024-07-25 07:30:28.343512] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:56.215 [2024-07-25 07:30:28.695380] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:56.473 [2024-07-25 07:30:28.820542] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:56.731 [2024-07-25 07:30:29.047920] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:24:56.990 [2024-07-25 07:30:29.272641] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:56.990 [2024-07-25 07:30:29.272762] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
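[editorial sketch] The loop traced above polls raid_bdev1 once per second and checks that a rebuild process targeting "spare" is still reported, breaking either when the process disappears (rebuild finished) or when the timeout expires. A condensed sketch of that cycle using the same RPCs and jq filters recorded in the log; $rpc again abbreviates the full scripts/rpc.py path, the timeout value 805 comes from the 'local timeout=805' line in the trace, and the add_base_bdev call assumes the spare was removed beforehand, as it was a few entries earlier.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
timeout=805

# Re-attach the spare so the raid1 rebuilds onto it (bdev_raid_add_base_bdev above),
# then poll once per second until no rebuild process is reported anymore.
$rpc bdev_raid_add_base_bdev raid_bdev1 spare

while (( SECONDS < timeout )); do
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r '.process.type // "none"'   <<< "$info") == rebuild ]] || break
    [[ $(jq -r '.process.target // "none"' <<< "$info") == spare   ]] || break
    sleep 1
done

After the loop exits, the test re-reads the bdev info and expects .process.type and .process.target to both report "none", which is exactly the verify_raid_bdev_process raid_bdev1 none none check that follows below.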
00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.990 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.990 [2024-07-25 07:30:29.517381] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:56.990 [2024-07-25 07:30:29.517657] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:57.248 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.248 "name": "raid_bdev1", 00:24:57.248 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:57.248 "strip_size_kb": 0, 00:24:57.248 "state": "online", 00:24:57.248 "raid_level": "raid1", 00:24:57.248 "superblock": true, 00:24:57.248 "num_base_bdevs": 2, 00:24:57.249 "num_base_bdevs_discovered": 2, 00:24:57.249 "num_base_bdevs_operational": 2, 00:24:57.249 "process": { 00:24:57.249 "type": "rebuild", 00:24:57.249 "target": "spare", 00:24:57.249 "progress": { 00:24:57.249 "blocks": 38912, 00:24:57.249 "percent": 61 00:24:57.249 } 00:24:57.249 }, 00:24:57.249 "base_bdevs_list": [ 00:24:57.249 { 00:24:57.249 "name": "spare", 00:24:57.249 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:24:57.249 "is_configured": true, 00:24:57.249 "data_offset": 2048, 00:24:57.249 "data_size": 63488 00:24:57.249 }, 00:24:57.249 { 00:24:57.249 "name": "BaseBdev2", 00:24:57.249 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:57.249 "is_configured": true, 00:24:57.249 "data_offset": 2048, 00:24:57.249 "data_size": 63488 00:24:57.249 } 00:24:57.249 ] 00:24:57.249 }' 00:24:57.249 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.249 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:57.249 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.249 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:57.249 07:30:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:57.249 [2024-07-25 07:30:29.743657] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:57.816 [2024-07-25 07:30:30.089853] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.383 
07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.383 [2024-07-25 07:30:30.768095] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:58.383 [2024-07-25 07:30:30.884767] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.383 "name": "raid_bdev1", 00:24:58.383 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:58.383 "strip_size_kb": 0, 00:24:58.383 "state": "online", 00:24:58.383 "raid_level": "raid1", 00:24:58.383 "superblock": true, 00:24:58.383 "num_base_bdevs": 2, 00:24:58.383 "num_base_bdevs_discovered": 2, 00:24:58.383 "num_base_bdevs_operational": 2, 00:24:58.383 "process": { 00:24:58.383 "type": "rebuild", 00:24:58.383 "target": "spare", 00:24:58.383 "progress": { 00:24:58.383 "blocks": 57344, 00:24:58.383 "percent": 90 00:24:58.383 } 00:24:58.383 }, 00:24:58.383 "base_bdevs_list": [ 00:24:58.383 { 00:24:58.383 "name": "spare", 00:24:58.383 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:24:58.383 "is_configured": true, 00:24:58.383 "data_offset": 2048, 00:24:58.383 "data_size": 63488 00:24:58.383 }, 00:24:58.383 { 00:24:58.383 "name": "BaseBdev2", 00:24:58.383 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:58.383 "is_configured": true, 00:24:58.383 "data_offset": 2048, 00:24:58.383 "data_size": 63488 00:24:58.383 } 00:24:58.383 ] 00:24:58.383 }' 00:24:58.383 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.641 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:58.641 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.641 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:58.641 07:30:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:58.900 [2024-07-25 07:30:31.203357] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:58.900 [2024-07-25 07:30:31.303604] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:58.900 [2024-07-25 07:30:31.304587] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.467 07:30:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.726 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.726 "name": "raid_bdev1", 00:24:59.726 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:24:59.726 "strip_size_kb": 0, 00:24:59.726 "state": "online", 00:24:59.726 "raid_level": "raid1", 00:24:59.726 "superblock": true, 00:24:59.726 "num_base_bdevs": 2, 00:24:59.726 "num_base_bdevs_discovered": 2, 00:24:59.726 "num_base_bdevs_operational": 2, 00:24:59.726 "base_bdevs_list": [ 00:24:59.726 { 00:24:59.726 "name": "spare", 00:24:59.726 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:24:59.726 "is_configured": true, 00:24:59.726 "data_offset": 2048, 00:24:59.726 "data_size": 63488 00:24:59.726 }, 00:24:59.726 { 00:24:59.726 "name": "BaseBdev2", 00:24:59.726 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:24:59.726 "is_configured": true, 00:24:59.726 "data_offset": 2048, 00:24:59.726 "data_size": 63488 00:24:59.726 } 00:24:59.726 ] 00:24:59.726 }' 00:24:59.726 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.984 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.241 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.241 "name": "raid_bdev1", 00:25:00.241 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:00.241 "strip_size_kb": 0, 00:25:00.241 "state": "online", 00:25:00.241 "raid_level": "raid1", 00:25:00.241 "superblock": true, 00:25:00.241 "num_base_bdevs": 2, 00:25:00.242 "num_base_bdevs_discovered": 2, 00:25:00.242 "num_base_bdevs_operational": 2, 00:25:00.242 "base_bdevs_list": [ 00:25:00.242 { 00:25:00.242 "name": "spare", 00:25:00.242 "uuid": 
"1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:25:00.242 "is_configured": true, 00:25:00.242 "data_offset": 2048, 00:25:00.242 "data_size": 63488 00:25:00.242 }, 00:25:00.242 { 00:25:00.242 "name": "BaseBdev2", 00:25:00.242 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:00.242 "is_configured": true, 00:25:00.242 "data_offset": 2048, 00:25:00.242 "data_size": 63488 00:25:00.242 } 00:25:00.242 ] 00:25:00.242 }' 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.242 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.500 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:00.500 "name": "raid_bdev1", 00:25:00.500 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:00.500 "strip_size_kb": 0, 00:25:00.500 "state": "online", 00:25:00.500 "raid_level": "raid1", 00:25:00.500 "superblock": true, 00:25:00.500 "num_base_bdevs": 2, 00:25:00.500 "num_base_bdevs_discovered": 2, 00:25:00.500 "num_base_bdevs_operational": 2, 00:25:00.500 "base_bdevs_list": [ 00:25:00.500 { 00:25:00.500 "name": "spare", 00:25:00.500 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:25:00.500 "is_configured": true, 00:25:00.500 "data_offset": 2048, 00:25:00.500 "data_size": 63488 00:25:00.500 }, 00:25:00.500 { 00:25:00.500 "name": "BaseBdev2", 00:25:00.500 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:00.500 "is_configured": true, 00:25:00.500 "data_offset": 2048, 00:25:00.500 "data_size": 63488 00:25:00.500 } 00:25:00.500 ] 00:25:00.500 }' 00:25:00.500 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:00.500 07:30:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:01.074 07:30:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:01.074 [2024-07-25 07:30:33.578568] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:01.074 [2024-07-25 07:30:33.578598] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:01.074 00:25:01.074 Latency(us) 00:25:01.074 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:01.074 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:01.074 raid_bdev1 : 11.15 105.42 316.27 0.00 0.00 12204.11 273.61 116601.65 00:25:01.074 =================================================================================================================== 00:25:01.074 Total : 105.42 316.27 0.00 0.00 12204.11 273.61 116601.65 00:25:01.074 [2024-07-25 07:30:33.602239] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:01.075 [2024-07-25 07:30:33.602265] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:01.075 [2024-07-25 07:30:33.602331] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:01.075 [2024-07-25 07:30:33.602342] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x246f6c0 name raid_bdev1, state offline 00:25:01.075 0 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:01.357 07:30:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:01.629 /dev/nbd0 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:01.629 07:30:34 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:01.629 1+0 records in 00:25:01.629 1+0 records out 00:25:01.629 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264394 s, 15.5 MB/s 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:01.629 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:01.887 /dev/nbd1 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:01.887 1+0 records in 00:25:01.887 1+0 records out 00:25:01.887 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246751 s, 16.6 MB/s 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:01.887 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:02.146 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:02.146 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:02.146 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:02.146 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:02.146 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:02.146 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:02.146 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:02.403 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:02.404 07:30:34 
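The nbd_common.sh traces above reduce to a data-integrity check: each remaining bdev is exported as a kernel NBD device with the nbd_start_disk RPC, waited on until it shows up in /proc/partitions and answers a 4 KiB O_DIRECT read, and then the two devices are compared with cmp -i 1048576, skipping the first 1 MiB on both sides so the superblock region (data_offset of 2048 blocks of 512 bytes) is excluded. A condensed sketch under those assumptions (the retry interval is illustrative; the helper retries a fixed number of times):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

"$rpc" -s "$sock" nbd_start_disk spare /dev/nbd0
"$rpc" -s "$sock" nbd_start_disk BaseBdev2 /dev/nbd1

for nbd in nbd0 nbd1; do
    # readiness check in the spirit of waitfornbd: listed in /proc/partitions and readable with O_DIRECT
    until grep -q -w "$nbd" /proc/partitions &&
          dd if=/dev/$nbd of=/dev/null bs=4096 count=1 iflag=direct 2>/dev/null; do
        sleep 0.1
    done
done

cmp -i 1048576 /dev/nbd0 /dev/nbd1          # identical content implies the rebuild copied the data correctly

"$rpc" -s "$sock" nbd_stop_disk /dev/nbd1
"$rpc" -s "$sock" nbd_stop_disk /dev/nbd0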
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:02.404 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:25:02.661 07:30:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:02.919 [2024-07-25 07:30:35.376455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:02.919 [2024-07-25 07:30:35.376496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:02.919 [2024-07-25 07:30:35.376516] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c7e30 00:25:02.919 [2024-07-25 07:30:35.376528] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:02.919 [2024-07-25 07:30:35.378215] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:02.919 [2024-07-25 07:30:35.378244] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:02.919 [2024-07-25 07:30:35.378314] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev spare 00:25:02.919 [2024-07-25 07:30:35.378341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:02.919 [2024-07-25 07:30:35.378437] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:02.919 spare 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.919 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.178 [2024-07-25 07:30:35.478746] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23f60b0 00:25:03.178 [2024-07-25 07:30:35.478760] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:03.178 [2024-07-25 07:30:35.478930] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2470720 00:25:03.178 [2024-07-25 07:30:35.479060] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23f60b0 00:25:03.178 [2024-07-25 07:30:35.479070] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23f60b0 00:25:03.178 [2024-07-25 07:30:35.479177] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:03.178 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:03.178 "name": "raid_bdev1", 00:25:03.178 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:03.178 "strip_size_kb": 0, 00:25:03.178 "state": "online", 00:25:03.178 "raid_level": "raid1", 00:25:03.178 "superblock": true, 00:25:03.178 "num_base_bdevs": 2, 00:25:03.178 "num_base_bdevs_discovered": 2, 00:25:03.178 "num_base_bdevs_operational": 2, 00:25:03.178 "base_bdevs_list": [ 00:25:03.178 { 00:25:03.178 "name": "spare", 00:25:03.178 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:25:03.178 "is_configured": true, 00:25:03.178 "data_offset": 2048, 00:25:03.178 "data_size": 63488 00:25:03.178 }, 00:25:03.178 { 00:25:03.178 "name": "BaseBdev2", 00:25:03.178 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:03.178 "is_configured": true, 00:25:03.178 "data_offset": 2048, 00:25:03.178 "data_size": 63488 00:25:03.178 } 00:25:03.178 ] 00:25:03.178 }' 00:25:03.178 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:25:03.178 07:30:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:03.744 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:03.744 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.744 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:03.744 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:03.744 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.744 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.744 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.002 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.002 "name": "raid_bdev1", 00:25:04.002 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:04.002 "strip_size_kb": 0, 00:25:04.002 "state": "online", 00:25:04.002 "raid_level": "raid1", 00:25:04.002 "superblock": true, 00:25:04.002 "num_base_bdevs": 2, 00:25:04.002 "num_base_bdevs_discovered": 2, 00:25:04.002 "num_base_bdevs_operational": 2, 00:25:04.002 "base_bdevs_list": [ 00:25:04.002 { 00:25:04.002 "name": "spare", 00:25:04.002 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:25:04.002 "is_configured": true, 00:25:04.002 "data_offset": 2048, 00:25:04.002 "data_size": 63488 00:25:04.002 }, 00:25:04.002 { 00:25:04.002 "name": "BaseBdev2", 00:25:04.002 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:04.002 "is_configured": true, 00:25:04.002 "data_offset": 2048, 00:25:04.002 "data_size": 63488 00:25:04.002 } 00:25:04.002 ] 00:25:04.002 }' 00:25:04.002 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.002 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:04.002 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.002 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:04.002 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.003 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:04.260 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:25:04.260 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:04.519 [2024-07-25 07:30:36.876776] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.519 07:30:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.777 07:30:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.777 "name": "raid_bdev1", 00:25:04.777 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:04.777 "strip_size_kb": 0, 00:25:04.777 "state": "online", 00:25:04.777 "raid_level": "raid1", 00:25:04.777 "superblock": true, 00:25:04.777 "num_base_bdevs": 2, 00:25:04.777 "num_base_bdevs_discovered": 1, 00:25:04.777 "num_base_bdevs_operational": 1, 00:25:04.777 "base_bdevs_list": [ 00:25:04.777 { 00:25:04.777 "name": null, 00:25:04.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.777 "is_configured": false, 00:25:04.777 "data_offset": 2048, 00:25:04.777 "data_size": 63488 00:25:04.777 }, 00:25:04.777 { 00:25:04.777 "name": "BaseBdev2", 00:25:04.777 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:04.777 "is_configured": true, 00:25:04.777 "data_offset": 2048, 00:25:04.777 "data_size": 63488 00:25:04.777 } 00:25:04.777 ] 00:25:04.777 }' 00:25:04.777 07:30:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.777 07:30:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:05.344 07:30:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:05.602 [2024-07-25 07:30:37.911638] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.602 [2024-07-25 07:30:37.911781] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:05.602 [2024-07-25 07:30:37.911797] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
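This is the hot-remove and re-add leg: bdev_raid_remove_base_bdev pulls the spare out, the array is expected to stay online but degraded (one null slot, num_base_bdevs_operational 1), and bdev_raid_add_base_bdev puts it back, which the log shows being treated as a re-add because the spare's superblock sequence number (4) is one generation behind the raid bdev's (5), so a fresh rebuild is started. The core of that sequence, condensed from the traces above (jq expressions as in the log):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

"$rpc" -s "$sock" bdev_raid_remove_base_bdev spare

info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r '.state' <<< "$info") == online ]]                       # still serving I/O
[[ $(jq -r '.num_base_bdevs_operational' <<< "$info") -eq 1 ]]      # but degraded
[[ $(jq -r '.base_bdevs_list[0].name' <<< "$info") == null ]]       # the removed slot is empty

# Re-adding the same bdev is accepted and kicks off a rebuild onto it.
"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare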
00:25:05.602 [2024-07-25 07:30:37.911824] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.602 [2024-07-25 07:30:37.916865] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2470720 00:25:05.602 [2024-07-25 07:30:37.918962] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:05.602 07:30:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:25:06.536 07:30:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.536 07:30:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.536 07:30:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.536 07:30:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.536 07:30:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.536 07:30:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.536 07:30:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.795 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.795 "name": "raid_bdev1", 00:25:06.795 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:06.795 "strip_size_kb": 0, 00:25:06.795 "state": "online", 00:25:06.795 "raid_level": "raid1", 00:25:06.795 "superblock": true, 00:25:06.795 "num_base_bdevs": 2, 00:25:06.795 "num_base_bdevs_discovered": 2, 00:25:06.795 "num_base_bdevs_operational": 2, 00:25:06.795 "process": { 00:25:06.795 "type": "rebuild", 00:25:06.795 "target": "spare", 00:25:06.795 "progress": { 00:25:06.795 "blocks": 24576, 00:25:06.795 "percent": 38 00:25:06.795 } 00:25:06.795 }, 00:25:06.795 "base_bdevs_list": [ 00:25:06.795 { 00:25:06.795 "name": "spare", 00:25:06.795 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:25:06.795 "is_configured": true, 00:25:06.795 "data_offset": 2048, 00:25:06.795 "data_size": 63488 00:25:06.795 }, 00:25:06.795 { 00:25:06.795 "name": "BaseBdev2", 00:25:06.795 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:06.795 "is_configured": true, 00:25:06.795 "data_offset": 2048, 00:25:06.795 "data_size": 63488 00:25:06.795 } 00:25:06.795 ] 00:25:06.795 }' 00:25:06.795 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.795 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:06.795 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.795 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:06.795 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:07.054 [2024-07-25 07:30:39.474689] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.054 [2024-07-25 07:30:39.530790] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:07.054 [2024-07-25 07:30:39.530830] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:07.054 [2024-07-25 07:30:39.530844] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.054 [2024-07-25 07:30:39.530851] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.054 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.313 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.313 "name": "raid_bdev1", 00:25:07.313 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:07.313 "strip_size_kb": 0, 00:25:07.313 "state": "online", 00:25:07.313 "raid_level": "raid1", 00:25:07.313 "superblock": true, 00:25:07.313 "num_base_bdevs": 2, 00:25:07.313 "num_base_bdevs_discovered": 1, 00:25:07.313 "num_base_bdevs_operational": 1, 00:25:07.313 "base_bdevs_list": [ 00:25:07.313 { 00:25:07.313 "name": null, 00:25:07.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.313 "is_configured": false, 00:25:07.313 "data_offset": 2048, 00:25:07.313 "data_size": 63488 00:25:07.313 }, 00:25:07.313 { 00:25:07.313 "name": "BaseBdev2", 00:25:07.313 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:07.313 "is_configured": true, 00:25:07.313 "data_offset": 2048, 00:25:07.313 "data_size": 63488 00:25:07.313 } 00:25:07.313 ] 00:25:07.313 }' 00:25:07.313 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.313 07:30:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:07.880 07:30:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:08.138 [2024-07-25 07:30:40.578503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:08.138 [2024-07-25 07:30:40.578553] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.138 [2024-07-25 07:30:40.578573] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2470e80 00:25:08.138 
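The passthru delete in the middle of that rebuild is deliberate error injection: removing the spare's passthru while it is the rebuild target forces the "Failed to remove target bdev: No such device" path seen above, after which raid_bdev1 must still be online but degraded, and the spare_delay passthru is created again for the next leg. A minimal sketch of that step (names as in the log):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

"$rpc" -s "$sock" bdev_passthru_delete spare          # remove the rebuild target out from under the raid

info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r '.state' <<< "$info") == online ]]
[[ $(jq -r '.num_base_bdevs_operational' <<< "$info") -eq 1 ]]

# recreate the delayed passthru on top of spare_delay so it can be re-added afterwards
"$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare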
[2024-07-25 07:30:40.578585] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.138 [2024-07-25 07:30:40.578937] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.138 [2024-07-25 07:30:40.578954] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:08.138 [2024-07-25 07:30:40.579032] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:08.138 [2024-07-25 07:30:40.579043] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:08.138 [2024-07-25 07:30:40.579053] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:08.138 [2024-07-25 07:30:40.579070] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.138 [2024-07-25 07:30:40.584154] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20cdba0 00:25:08.138 spare 00:25:08.138 [2024-07-25 07:30:40.585520] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.138 07:30:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:25:09.072 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.072 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.072 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.072 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.072 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.331 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.331 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.331 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.331 "name": "raid_bdev1", 00:25:09.331 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:09.331 "strip_size_kb": 0, 00:25:09.331 "state": "online", 00:25:09.331 "raid_level": "raid1", 00:25:09.331 "superblock": true, 00:25:09.331 "num_base_bdevs": 2, 00:25:09.331 "num_base_bdevs_discovered": 2, 00:25:09.331 "num_base_bdevs_operational": 2, 00:25:09.331 "process": { 00:25:09.331 "type": "rebuild", 00:25:09.331 "target": "spare", 00:25:09.331 "progress": { 00:25:09.331 "blocks": 24576, 00:25:09.331 "percent": 38 00:25:09.331 } 00:25:09.331 }, 00:25:09.331 "base_bdevs_list": [ 00:25:09.331 { 00:25:09.331 "name": "spare", 00:25:09.331 "uuid": "1dcff495-62f4-5cf7-a42b-80d6a3f72e73", 00:25:09.331 "is_configured": true, 00:25:09.331 "data_offset": 2048, 00:25:09.331 "data_size": 63488 00:25:09.331 }, 00:25:09.331 { 00:25:09.331 "name": "BaseBdev2", 00:25:09.331 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:09.331 "is_configured": true, 00:25:09.331 "data_offset": 2048, 00:25:09.331 "data_size": 63488 00:25:09.331 } 00:25:09.331 ] 00:25:09.331 }' 00:25:09.331 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.589 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.589 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.589 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:09.589 07:30:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:09.847 [2024-07-25 07:30:42.133283] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:09.847 [2024-07-25 07:30:42.197415] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:09.847 [2024-07-25 07:30:42.197460] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:09.847 [2024-07-25 07:30:42.197474] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:09.847 [2024-07-25 07:30:42.197482] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.847 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.105 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.105 "name": "raid_bdev1", 00:25:10.105 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:10.105 "strip_size_kb": 0, 00:25:10.105 "state": "online", 00:25:10.105 "raid_level": "raid1", 00:25:10.105 "superblock": true, 00:25:10.105 "num_base_bdevs": 2, 00:25:10.105 "num_base_bdevs_discovered": 1, 00:25:10.105 "num_base_bdevs_operational": 1, 00:25:10.105 "base_bdevs_list": [ 00:25:10.105 { 00:25:10.105 "name": null, 00:25:10.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.105 "is_configured": false, 00:25:10.105 "data_offset": 2048, 00:25:10.105 "data_size": 63488 00:25:10.105 }, 00:25:10.105 { 00:25:10.105 "name": "BaseBdev2", 00:25:10.105 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:10.105 "is_configured": true, 00:25:10.105 "data_offset": 2048, 00:25:10.105 "data_size": 63488 00:25:10.105 } 00:25:10.105 ] 00:25:10.105 
}' 00:25:10.105 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.105 07:30:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:10.672 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:10.672 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:10.672 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:10.672 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:10.672 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:10.672 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.672 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.930 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.930 "name": "raid_bdev1", 00:25:10.930 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:10.930 "strip_size_kb": 0, 00:25:10.930 "state": "online", 00:25:10.930 "raid_level": "raid1", 00:25:10.930 "superblock": true, 00:25:10.930 "num_base_bdevs": 2, 00:25:10.930 "num_base_bdevs_discovered": 1, 00:25:10.931 "num_base_bdevs_operational": 1, 00:25:10.931 "base_bdevs_list": [ 00:25:10.931 { 00:25:10.931 "name": null, 00:25:10.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.931 "is_configured": false, 00:25:10.931 "data_offset": 2048, 00:25:10.931 "data_size": 63488 00:25:10.931 }, 00:25:10.931 { 00:25:10.931 "name": "BaseBdev2", 00:25:10.931 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:10.931 "is_configured": true, 00:25:10.931 "data_offset": 2048, 00:25:10.931 "data_size": 63488 00:25:10.931 } 00:25:10.931 ] 00:25:10.931 }' 00:25:10.931 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.931 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:10.931 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.931 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:10.931 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:11.189 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:11.448 [2024-07-25 07:30:43.762085] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:11.448 [2024-07-25 07:30:43.762135] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:11.448 [2024-07-25 07:30:43.762171] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x247ac90 00:25:11.448 [2024-07-25 07:30:43.762184] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:11.448 [2024-07-25 07:30:43.762519] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:25:11.448 [2024-07-25 07:30:43.762536] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:11.448 [2024-07-25 07:30:43.762598] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:11.448 [2024-07-25 07:30:43.762610] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:11.448 [2024-07-25 07:30:43.762619] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:11.448 BaseBdev1 00:25:11.448 07:30:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.383 07:30:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.641 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.642 "name": "raid_bdev1", 00:25:12.642 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:12.642 "strip_size_kb": 0, 00:25:12.642 "state": "online", 00:25:12.642 "raid_level": "raid1", 00:25:12.642 "superblock": true, 00:25:12.642 "num_base_bdevs": 2, 00:25:12.642 "num_base_bdevs_discovered": 1, 00:25:12.642 "num_base_bdevs_operational": 1, 00:25:12.642 "base_bdevs_list": [ 00:25:12.642 { 00:25:12.642 "name": null, 00:25:12.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.642 "is_configured": false, 00:25:12.642 "data_offset": 2048, 00:25:12.642 "data_size": 63488 00:25:12.642 }, 00:25:12.642 { 00:25:12.642 "name": "BaseBdev2", 00:25:12.642 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:12.642 "is_configured": true, 00:25:12.642 "data_offset": 2048, 00:25:12.642 "data_size": 63488 00:25:12.642 } 00:25:12.642 ] 00:25:12.642 }' 00:25:12.642 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.642 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:13.209 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.209 07:30:45 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.209 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.209 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.209 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.209 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.209 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.467 "name": "raid_bdev1", 00:25:13.467 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:13.467 "strip_size_kb": 0, 00:25:13.467 "state": "online", 00:25:13.467 "raid_level": "raid1", 00:25:13.467 "superblock": true, 00:25:13.467 "num_base_bdevs": 2, 00:25:13.467 "num_base_bdevs_discovered": 1, 00:25:13.467 "num_base_bdevs_operational": 1, 00:25:13.467 "base_bdevs_list": [ 00:25:13.467 { 00:25:13.467 "name": null, 00:25:13.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.467 "is_configured": false, 00:25:13.467 "data_offset": 2048, 00:25:13.467 "data_size": 63488 00:25:13.467 }, 00:25:13.467 { 00:25:13.467 "name": "BaseBdev2", 00:25:13.467 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:13.467 "is_configured": true, 00:25:13.467 "data_offset": 2048, 00:25:13.467 "data_size": 63488 00:25:13.467 } 00:25:13.467 ] 00:25:13.467 }' 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local es=0 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.467 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:13.468 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.468 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:13.468 07:30:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:13.726 [2024-07-25 07:30:46.120804] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:13.726 [2024-07-25 07:30:46.120922] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:13.726 [2024-07-25 07:30:46.120937] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:13.726 request: 00:25:13.726 { 00:25:13.726 "base_bdev": "BaseBdev1", 00:25:13.726 "raid_bdev": "raid_bdev1", 00:25:13.726 "method": "bdev_raid_add_base_bdev", 00:25:13.726 "req_id": 1 00:25:13.726 } 00:25:13.726 Got JSON-RPC error response 00:25:13.726 response: 00:25:13.726 { 00:25:13.726 "code": -22, 00:25:13.726 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:13.726 } 00:25:13.726 07:30:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:25:13.726 07:30:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:25:13.726 07:30:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:25:13.726 07:30:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:25:13.726 07:30:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:25:14.676 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.677 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.949 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:25:14.949 "name": "raid_bdev1", 00:25:14.949 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:14.949 "strip_size_kb": 0, 00:25:14.949 "state": "online", 00:25:14.949 "raid_level": "raid1", 00:25:14.949 "superblock": true, 00:25:14.949 "num_base_bdevs": 2, 00:25:14.949 "num_base_bdevs_discovered": 1, 00:25:14.949 "num_base_bdevs_operational": 1, 00:25:14.949 "base_bdevs_list": [ 00:25:14.949 { 00:25:14.949 "name": null, 00:25:14.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.949 "is_configured": false, 00:25:14.949 "data_offset": 2048, 00:25:14.949 "data_size": 63488 00:25:14.949 }, 00:25:14.949 { 00:25:14.949 "name": "BaseBdev2", 00:25:14.949 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:14.949 "is_configured": true, 00:25:14.949 "data_offset": 2048, 00:25:14.949 "data_size": 63488 00:25:14.949 } 00:25:14.949 ] 00:25:14.949 }' 00:25:14.949 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.949 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:15.515 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:15.515 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:15.515 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:15.515 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:15.515 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:15.515 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.515 07:30:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:15.777 "name": "raid_bdev1", 00:25:15.777 "uuid": "246cf965-6684-4930-8502-ddb06b9c4144", 00:25:15.777 "strip_size_kb": 0, 00:25:15.777 "state": "online", 00:25:15.777 "raid_level": "raid1", 00:25:15.777 "superblock": true, 00:25:15.777 "num_base_bdevs": 2, 00:25:15.777 "num_base_bdevs_discovered": 1, 00:25:15.777 "num_base_bdevs_operational": 1, 00:25:15.777 "base_bdevs_list": [ 00:25:15.777 { 00:25:15.777 "name": null, 00:25:15.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.777 "is_configured": false, 00:25:15.777 "data_offset": 2048, 00:25:15.777 "data_size": 63488 00:25:15.777 }, 00:25:15.777 { 00:25:15.777 "name": "BaseBdev2", 00:25:15.777 "uuid": "f6cfc5ef-b20c-5959-9de8-4780d0796d4d", 00:25:15.777 "is_configured": true, 00:25:15.777 "data_offset": 2048, 00:25:15.777 "data_size": 63488 00:25:15.777 } 00:25:15.777 ] 00:25:15.777 }' 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1727976 00:25:15.777 07:30:48 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1727976 ']' 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1727976 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:15.777 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1727976 00:25:16.036 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:16.036 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:16.037 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1727976' 00:25:16.037 killing process with pid 1727976 00:25:16.037 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1727976 00:25:16.037 Received shutdown signal, test time was about 25.837841 seconds 00:25:16.037 00:25:16.037 Latency(us) 00:25:16.037 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:16.037 =================================================================================================================== 00:25:16.037 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:16.037 [2024-07-25 07:30:48.317215] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:16.037 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1727976 00:25:16.037 [2024-07-25 07:30:48.317301] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:16.037 [2024-07-25 07:30:48.317344] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:16.037 [2024-07-25 07:30:48.317355] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23f60b0 name raid_bdev1, state offline 00:25:16.037 [2024-07-25 07:30:48.335920] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:16.037 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:25:16.037 00:25:16.037 real 0m30.253s 00:25:16.037 user 0m46.837s 00:25:16.037 sys 0m4.416s 00:25:16.037 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:16.037 07:30:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:16.037 ************************************ 00:25:16.037 END TEST raid_rebuild_test_sb_io 00:25:16.037 ************************************ 00:25:16.295 07:30:48 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:25:16.295 07:30:48 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:25:16.295 07:30:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:16.295 07:30:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:16.295 07:30:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:16.295 ************************************ 00:25:16.295 START TEST raid_rebuild_test 00:25:16.295 ************************************ 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false false true 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local 
raid_level=raid1 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=1733474 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 1733474 /var/tmp/spdk-raid.sock 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 1733474 ']' 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:16.295 07:30:48 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:16.295 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:16.295 07:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:16.295 [2024-07-25 07:30:48.686539] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:25:16.295 [2024-07-25 07:30:48.686598] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1733474 ] 00:25:16.295 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:16.295 Zero copy mechanism will not be used. 00:25:16.295 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.295 EAL: Requested device 0000:3d:01.0 cannot be used 00:25:16.295 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.295 EAL: Requested device 0000:3d:01.1 cannot be used 00:25:16.295 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.295 EAL: Requested device 0000:3d:01.2 cannot be used 00:25:16.295 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.295 EAL: Requested device 0000:3d:01.3 cannot be used 00:25:16.295 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:01.4 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:01.5 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:01.6 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:01.7 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.0 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.1 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.2 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.3 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.4 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.5 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.6 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3d:02.7 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.0 cannot be used 00:25:16.296 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.1 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.2 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.3 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.4 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.5 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.6 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:01.7 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.0 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.1 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.2 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.3 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.4 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.5 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.6 cannot be used 00:25:16.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.296 EAL: Requested device 0000:3f:02.7 cannot be used 00:25:16.296 [2024-07-25 07:30:48.818474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:16.554 [2024-07-25 07:30:48.905226] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:25:16.554 [2024-07-25 07:30:48.964445] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:16.554 [2024-07-25 07:30:48.964481] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:17.120 07:30:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:17.120 07:30:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:25:17.120 07:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:17.120 07:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:17.379 BaseBdev1_malloc 00:25:17.379 07:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:17.637 [2024-07-25 07:30:50.016922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:17.638 [2024-07-25 07:30:50.016965] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:25:17.638 [2024-07-25 07:30:50.016990] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2469690 00:25:17.638 [2024-07-25 07:30:50.017002] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.638 [2024-07-25 07:30:50.018548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.638 [2024-07-25 07:30:50.018575] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:17.638 BaseBdev1 00:25:17.638 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:17.638 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:17.896 BaseBdev2_malloc 00:25:17.896 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:18.155 [2024-07-25 07:30:50.478622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:18.155 [2024-07-25 07:30:50.478662] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.155 [2024-07-25 07:30:50.478684] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246a050 00:25:18.155 [2024-07-25 07:30:50.478696] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.155 [2024-07-25 07:30:50.480033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.155 [2024-07-25 07:30:50.480060] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:18.155 BaseBdev2 00:25:18.155 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:18.155 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:18.413 BaseBdev3_malloc 00:25:18.413 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:18.413 [2024-07-25 07:30:50.936062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:18.413 [2024-07-25 07:30:50.936103] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.413 [2024-07-25 07:30:50.936126] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2508280 00:25:18.413 [2024-07-25 07:30:50.936137] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.413 [2024-07-25 07:30:50.937472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.414 [2024-07-25 07:30:50.937497] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:18.414 BaseBdev3 00:25:18.672 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:18.672 07:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:18.672 
BaseBdev4_malloc 00:25:18.672 07:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:18.931 [2024-07-25 07:30:51.381416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:18.931 [2024-07-25 07:30:51.381456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.931 [2024-07-25 07:30:51.381474] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x250b600 00:25:18.931 [2024-07-25 07:30:51.381485] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.931 [2024-07-25 07:30:51.382820] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.931 [2024-07-25 07:30:51.382846] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:18.931 BaseBdev4 00:25:18.931 07:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:19.189 spare_malloc 00:25:19.189 07:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:19.448 spare_delay 00:25:19.448 07:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:19.707 [2024-07-25 07:30:52.067557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:19.707 [2024-07-25 07:30:52.067597] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:19.707 [2024-07-25 07:30:52.067615] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25154c0 00:25:19.707 [2024-07-25 07:30:52.067627] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:19.707 [2024-07-25 07:30:52.069001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:19.707 [2024-07-25 07:30:52.069028] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:19.707 spare 00:25:19.707 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:19.966 [2024-07-25 07:30:52.280132] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:19.966 [2024-07-25 07:30:52.281241] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:19.966 [2024-07-25 07:30:52.281300] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:19.966 [2024-07-25 07:30:52.281341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:19.966 [2024-07-25 07:30:52.281420] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x250c2f0 00:25:19.966 [2024-07-25 07:30:52.281429] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:19.966 [2024-07-25 07:30:52.281614] bdev_raid.c: 263:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x250c960 00:25:19.966 [2024-07-25 07:30:52.281756] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250c2f0 00:25:19.966 [2024-07-25 07:30:52.281769] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x250c2f0 00:25:19.966 [2024-07-25 07:30:52.281872] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.966 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.225 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.225 "name": "raid_bdev1", 00:25:20.225 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:20.225 "strip_size_kb": 0, 00:25:20.225 "state": "online", 00:25:20.225 "raid_level": "raid1", 00:25:20.225 "superblock": false, 00:25:20.225 "num_base_bdevs": 4, 00:25:20.225 "num_base_bdevs_discovered": 4, 00:25:20.225 "num_base_bdevs_operational": 4, 00:25:20.225 "base_bdevs_list": [ 00:25:20.225 { 00:25:20.225 "name": "BaseBdev1", 00:25:20.225 "uuid": "544cf2b6-fd24-5043-ac8f-5190756213e5", 00:25:20.225 "is_configured": true, 00:25:20.225 "data_offset": 0, 00:25:20.225 "data_size": 65536 00:25:20.225 }, 00:25:20.225 { 00:25:20.225 "name": "BaseBdev2", 00:25:20.225 "uuid": "834f8de9-8779-5689-a8ed-284b6b5656f3", 00:25:20.225 "is_configured": true, 00:25:20.225 "data_offset": 0, 00:25:20.225 "data_size": 65536 00:25:20.225 }, 00:25:20.225 { 00:25:20.225 "name": "BaseBdev3", 00:25:20.225 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:20.225 "is_configured": true, 00:25:20.225 "data_offset": 0, 00:25:20.225 "data_size": 65536 00:25:20.225 }, 00:25:20.225 { 00:25:20.225 "name": "BaseBdev4", 00:25:20.225 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:20.225 "is_configured": true, 00:25:20.225 "data_offset": 0, 00:25:20.225 "data_size": 65536 00:25:20.225 } 00:25:20.225 ] 00:25:20.225 }' 00:25:20.225 07:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:20.225 07:30:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:20.791 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:20.791 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:25:20.791 [2024-07-25 07:30:53.279031] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:20.791 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:25:20.791 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.791 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.050 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:21.307 [2024-07-25 07:30:53.727971] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24602e0 00:25:21.307 /dev/nbd0 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd 
if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:21.307 1+0 records in 00:25:21.307 1+0 records out 00:25:21.307 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229919 s, 17.8 MB/s 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:25:21.307 07:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:27.870 65536+0 records in 00:25:27.870 65536+0 records out 00:25:27.870 33554432 bytes (34 MB, 32 MiB) copied, 6.21318 s, 5.4 MB/s 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:27.870 [2024-07-25 07:31:00.259011] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:27.870 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:28.129 [2024-07-25 07:31:00.463586] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev1 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.129 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:28.129 "name": "raid_bdev1", 00:25:28.129 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:28.129 "strip_size_kb": 0, 00:25:28.129 "state": "online", 00:25:28.129 "raid_level": "raid1", 00:25:28.129 "superblock": false, 00:25:28.129 "num_base_bdevs": 4, 00:25:28.129 "num_base_bdevs_discovered": 3, 00:25:28.129 "num_base_bdevs_operational": 3, 00:25:28.129 "base_bdevs_list": [ 00:25:28.129 { 00:25:28.129 "name": null, 00:25:28.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.129 "is_configured": false, 00:25:28.129 "data_offset": 0, 00:25:28.129 "data_size": 65536 00:25:28.129 }, 00:25:28.129 { 00:25:28.129 "name": "BaseBdev2", 00:25:28.129 "uuid": "834f8de9-8779-5689-a8ed-284b6b5656f3", 00:25:28.129 "is_configured": true, 00:25:28.129 "data_offset": 0, 00:25:28.129 "data_size": 65536 00:25:28.129 }, 00:25:28.129 { 00:25:28.129 "name": "BaseBdev3", 00:25:28.129 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:28.129 "is_configured": true, 00:25:28.129 "data_offset": 0, 00:25:28.129 "data_size": 65536 00:25:28.129 }, 00:25:28.129 { 00:25:28.129 "name": "BaseBdev4", 00:25:28.129 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:28.129 "is_configured": true, 00:25:28.129 "data_offset": 0, 00:25:28.129 "data_size": 65536 00:25:28.129 } 00:25:28.129 ] 00:25:28.129 }' 00:25:28.130 07:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:28.130 07:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:28.696 07:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:28.953 [2024-07-25 07:31:01.438174] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:28.953 [2024-07-25 07:31:01.442103] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2466e40 00:25:28.953 [2024-07-25 07:31:01.444264] bdev_raid.c:2905:raid_bdev_process_thread_init: 
*NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:28.953 07:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.326 "name": "raid_bdev1", 00:25:30.326 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:30.326 "strip_size_kb": 0, 00:25:30.326 "state": "online", 00:25:30.326 "raid_level": "raid1", 00:25:30.326 "superblock": false, 00:25:30.326 "num_base_bdevs": 4, 00:25:30.326 "num_base_bdevs_discovered": 4, 00:25:30.326 "num_base_bdevs_operational": 4, 00:25:30.326 "process": { 00:25:30.326 "type": "rebuild", 00:25:30.326 "target": "spare", 00:25:30.326 "progress": { 00:25:30.326 "blocks": 24576, 00:25:30.326 "percent": 37 00:25:30.326 } 00:25:30.326 }, 00:25:30.326 "base_bdevs_list": [ 00:25:30.326 { 00:25:30.326 "name": "spare", 00:25:30.326 "uuid": "7a264957-3adb-5cd6-a76e-e0eaf6e346aa", 00:25:30.326 "is_configured": true, 00:25:30.326 "data_offset": 0, 00:25:30.326 "data_size": 65536 00:25:30.326 }, 00:25:30.326 { 00:25:30.326 "name": "BaseBdev2", 00:25:30.326 "uuid": "834f8de9-8779-5689-a8ed-284b6b5656f3", 00:25:30.326 "is_configured": true, 00:25:30.326 "data_offset": 0, 00:25:30.326 "data_size": 65536 00:25:30.326 }, 00:25:30.326 { 00:25:30.326 "name": "BaseBdev3", 00:25:30.326 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:30.326 "is_configured": true, 00:25:30.326 "data_offset": 0, 00:25:30.326 "data_size": 65536 00:25:30.326 }, 00:25:30.326 { 00:25:30.326 "name": "BaseBdev4", 00:25:30.326 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:30.326 "is_configured": true, 00:25:30.326 "data_offset": 0, 00:25:30.326 "data_size": 65536 00:25:30.326 } 00:25:30.326 ] 00:25:30.326 }' 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:30.326 07:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:30.584 [2024-07-25 07:31:02.997282] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:30.584 [2024-07-25 07:31:03.055967] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:30.584 
[2024-07-25 07:31:03.056011] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:30.584 [2024-07-25 07:31:03.056027] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:30.584 [2024-07-25 07:31:03.056034] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:30.584 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:30.584 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:30.584 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.584 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.584 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.584 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:30.584 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.585 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.585 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.585 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.585 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.585 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.843 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.843 "name": "raid_bdev1", 00:25:30.843 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:30.843 "strip_size_kb": 0, 00:25:30.843 "state": "online", 00:25:30.843 "raid_level": "raid1", 00:25:30.843 "superblock": false, 00:25:30.843 "num_base_bdevs": 4, 00:25:30.843 "num_base_bdevs_discovered": 3, 00:25:30.843 "num_base_bdevs_operational": 3, 00:25:30.843 "base_bdevs_list": [ 00:25:30.843 { 00:25:30.843 "name": null, 00:25:30.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.843 "is_configured": false, 00:25:30.843 "data_offset": 0, 00:25:30.843 "data_size": 65536 00:25:30.843 }, 00:25:30.843 { 00:25:30.843 "name": "BaseBdev2", 00:25:30.843 "uuid": "834f8de9-8779-5689-a8ed-284b6b5656f3", 00:25:30.843 "is_configured": true, 00:25:30.843 "data_offset": 0, 00:25:30.843 "data_size": 65536 00:25:30.843 }, 00:25:30.843 { 00:25:30.843 "name": "BaseBdev3", 00:25:30.843 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:30.843 "is_configured": true, 00:25:30.843 "data_offset": 0, 00:25:30.843 "data_size": 65536 00:25:30.843 }, 00:25:30.843 { 00:25:30.843 "name": "BaseBdev4", 00:25:30.843 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:30.843 "is_configured": true, 00:25:30.843 "data_offset": 0, 00:25:30.843 "data_size": 65536 00:25:30.843 } 00:25:30.843 ] 00:25:30.843 }' 00:25:30.843 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.843 07:31:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:31.407 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:31.407 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # 
local raid_bdev_name=raid_bdev1 00:25:31.407 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:31.407 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:31.407 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.407 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.407 07:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.665 07:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.665 "name": "raid_bdev1", 00:25:31.665 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:31.665 "strip_size_kb": 0, 00:25:31.665 "state": "online", 00:25:31.665 "raid_level": "raid1", 00:25:31.665 "superblock": false, 00:25:31.665 "num_base_bdevs": 4, 00:25:31.665 "num_base_bdevs_discovered": 3, 00:25:31.665 "num_base_bdevs_operational": 3, 00:25:31.665 "base_bdevs_list": [ 00:25:31.665 { 00:25:31.665 "name": null, 00:25:31.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:31.665 "is_configured": false, 00:25:31.665 "data_offset": 0, 00:25:31.665 "data_size": 65536 00:25:31.665 }, 00:25:31.665 { 00:25:31.665 "name": "BaseBdev2", 00:25:31.665 "uuid": "834f8de9-8779-5689-a8ed-284b6b5656f3", 00:25:31.665 "is_configured": true, 00:25:31.665 "data_offset": 0, 00:25:31.665 "data_size": 65536 00:25:31.665 }, 00:25:31.665 { 00:25:31.665 "name": "BaseBdev3", 00:25:31.665 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:31.665 "is_configured": true, 00:25:31.665 "data_offset": 0, 00:25:31.665 "data_size": 65536 00:25:31.665 }, 00:25:31.665 { 00:25:31.665 "name": "BaseBdev4", 00:25:31.665 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:31.665 "is_configured": true, 00:25:31.665 "data_offset": 0, 00:25:31.665 "data_size": 65536 00:25:31.665 } 00:25:31.665 ] 00:25:31.665 }' 00:25:31.665 07:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.665 07:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:31.665 07:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.665 07:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:31.665 07:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:31.963 [2024-07-25 07:31:04.383393] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:31.963 [2024-07-25 07:31:04.387288] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2466e40 00:25:31.963 [2024-07-25 07:31:04.388756] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:31.963 07:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:32.904 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:32.904 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.904 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:32.904 07:31:05 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:32.904 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.904 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.904 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.162 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.162 "name": "raid_bdev1", 00:25:33.162 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:33.162 "strip_size_kb": 0, 00:25:33.162 "state": "online", 00:25:33.162 "raid_level": "raid1", 00:25:33.162 "superblock": false, 00:25:33.162 "num_base_bdevs": 4, 00:25:33.162 "num_base_bdevs_discovered": 4, 00:25:33.162 "num_base_bdevs_operational": 4, 00:25:33.162 "process": { 00:25:33.162 "type": "rebuild", 00:25:33.162 "target": "spare", 00:25:33.162 "progress": { 00:25:33.162 "blocks": 24576, 00:25:33.162 "percent": 37 00:25:33.162 } 00:25:33.162 }, 00:25:33.162 "base_bdevs_list": [ 00:25:33.162 { 00:25:33.162 "name": "spare", 00:25:33.162 "uuid": "7a264957-3adb-5cd6-a76e-e0eaf6e346aa", 00:25:33.162 "is_configured": true, 00:25:33.162 "data_offset": 0, 00:25:33.162 "data_size": 65536 00:25:33.162 }, 00:25:33.162 { 00:25:33.162 "name": "BaseBdev2", 00:25:33.162 "uuid": "834f8de9-8779-5689-a8ed-284b6b5656f3", 00:25:33.162 "is_configured": true, 00:25:33.162 "data_offset": 0, 00:25:33.162 "data_size": 65536 00:25:33.162 }, 00:25:33.162 { 00:25:33.162 "name": "BaseBdev3", 00:25:33.162 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:33.162 "is_configured": true, 00:25:33.162 "data_offset": 0, 00:25:33.162 "data_size": 65536 00:25:33.162 }, 00:25:33.162 { 00:25:33.162 "name": "BaseBdev4", 00:25:33.162 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:33.162 "is_configured": true, 00:25:33.162 "data_offset": 0, 00:25:33.163 "data_size": 65536 00:25:33.163 } 00:25:33.163 ] 00:25:33.163 }' 00:25:33.163 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.163 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:33.163 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.420 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:33.420 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:25:33.420 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:25:33.420 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:25:33.420 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:25:33.420 07:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:33.420 [2024-07-25 07:31:05.930120] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:33.678 [2024-07-25 07:31:06.000442] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2466e40 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:25:33.678 07:31:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.678 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.936 "name": "raid_bdev1", 00:25:33.936 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:33.936 "strip_size_kb": 0, 00:25:33.936 "state": "online", 00:25:33.936 "raid_level": "raid1", 00:25:33.936 "superblock": false, 00:25:33.936 "num_base_bdevs": 4, 00:25:33.936 "num_base_bdevs_discovered": 3, 00:25:33.936 "num_base_bdevs_operational": 3, 00:25:33.936 "process": { 00:25:33.936 "type": "rebuild", 00:25:33.936 "target": "spare", 00:25:33.936 "progress": { 00:25:33.936 "blocks": 36864, 00:25:33.936 "percent": 56 00:25:33.936 } 00:25:33.936 }, 00:25:33.936 "base_bdevs_list": [ 00:25:33.936 { 00:25:33.936 "name": "spare", 00:25:33.936 "uuid": "7a264957-3adb-5cd6-a76e-e0eaf6e346aa", 00:25:33.936 "is_configured": true, 00:25:33.936 "data_offset": 0, 00:25:33.936 "data_size": 65536 00:25:33.936 }, 00:25:33.936 { 00:25:33.936 "name": null, 00:25:33.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.936 "is_configured": false, 00:25:33.936 "data_offset": 0, 00:25:33.936 "data_size": 65536 00:25:33.936 }, 00:25:33.936 { 00:25:33.936 "name": "BaseBdev3", 00:25:33.936 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:33.936 "is_configured": true, 00:25:33.936 "data_offset": 0, 00:25:33.936 "data_size": 65536 00:25:33.936 }, 00:25:33.936 { 00:25:33.936 "name": "BaseBdev4", 00:25:33.936 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:33.936 "is_configured": true, 00:25:33.936 "data_offset": 0, 00:25:33.936 "data_size": 65536 00:25:33.936 } 00:25:33.936 ] 00:25:33.936 }' 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=844 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.936 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.194 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:34.194 "name": "raid_bdev1", 00:25:34.194 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:34.194 "strip_size_kb": 0, 00:25:34.194 "state": "online", 00:25:34.194 "raid_level": "raid1", 00:25:34.194 "superblock": false, 00:25:34.194 "num_base_bdevs": 4, 00:25:34.194 "num_base_bdevs_discovered": 3, 00:25:34.194 "num_base_bdevs_operational": 3, 00:25:34.194 "process": { 00:25:34.194 "type": "rebuild", 00:25:34.194 "target": "spare", 00:25:34.194 "progress": { 00:25:34.194 "blocks": 43008, 00:25:34.194 "percent": 65 00:25:34.194 } 00:25:34.194 }, 00:25:34.194 "base_bdevs_list": [ 00:25:34.194 { 00:25:34.194 "name": "spare", 00:25:34.194 "uuid": "7a264957-3adb-5cd6-a76e-e0eaf6e346aa", 00:25:34.194 "is_configured": true, 00:25:34.194 "data_offset": 0, 00:25:34.194 "data_size": 65536 00:25:34.194 }, 00:25:34.194 { 00:25:34.194 "name": null, 00:25:34.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.194 "is_configured": false, 00:25:34.194 "data_offset": 0, 00:25:34.194 "data_size": 65536 00:25:34.194 }, 00:25:34.194 { 00:25:34.194 "name": "BaseBdev3", 00:25:34.194 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:34.194 "is_configured": true, 00:25:34.194 "data_offset": 0, 00:25:34.194 "data_size": 65536 00:25:34.194 }, 00:25:34.194 { 00:25:34.194 "name": "BaseBdev4", 00:25:34.194 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:34.194 "is_configured": true, 00:25:34.194 "data_offset": 0, 00:25:34.194 "data_size": 65536 00:25:34.194 } 00:25:34.194 ] 00:25:34.194 }' 00:25:34.194 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.194 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:34.194 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.194 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:34.194 07:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:35.128 [2024-07-25 07:31:07.612057] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:35.128 [2024-07-25 07:31:07.612110] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:35.128 [2024-07-25 07:31:07.612150] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:35.128 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:35.128 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.128 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.128 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.128 07:31:07 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.128 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.128 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.128 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.386 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.386 "name": "raid_bdev1", 00:25:35.386 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:35.386 "strip_size_kb": 0, 00:25:35.386 "state": "online", 00:25:35.386 "raid_level": "raid1", 00:25:35.386 "superblock": false, 00:25:35.386 "num_base_bdevs": 4, 00:25:35.386 "num_base_bdevs_discovered": 3, 00:25:35.386 "num_base_bdevs_operational": 3, 00:25:35.386 "base_bdevs_list": [ 00:25:35.386 { 00:25:35.386 "name": "spare", 00:25:35.386 "uuid": "7a264957-3adb-5cd6-a76e-e0eaf6e346aa", 00:25:35.386 "is_configured": true, 00:25:35.386 "data_offset": 0, 00:25:35.386 "data_size": 65536 00:25:35.386 }, 00:25:35.386 { 00:25:35.386 "name": null, 00:25:35.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.386 "is_configured": false, 00:25:35.386 "data_offset": 0, 00:25:35.386 "data_size": 65536 00:25:35.386 }, 00:25:35.386 { 00:25:35.386 "name": "BaseBdev3", 00:25:35.386 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:35.386 "is_configured": true, 00:25:35.386 "data_offset": 0, 00:25:35.386 "data_size": 65536 00:25:35.386 }, 00:25:35.386 { 00:25:35.386 "name": "BaseBdev4", 00:25:35.386 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:35.386 "is_configured": true, 00:25:35.386 "data_offset": 0, 00:25:35.386 "data_size": 65536 00:25:35.386 } 00:25:35.386 ] 00:25:35.386 }' 00:25:35.387 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.387 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:35.387 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.645 07:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.645 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.645 "name": "raid_bdev1", 00:25:35.645 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:35.645 "strip_size_kb": 0, 00:25:35.645 "state": "online", 
00:25:35.645 "raid_level": "raid1", 00:25:35.645 "superblock": false, 00:25:35.645 "num_base_bdevs": 4, 00:25:35.645 "num_base_bdevs_discovered": 3, 00:25:35.645 "num_base_bdevs_operational": 3, 00:25:35.645 "base_bdevs_list": [ 00:25:35.645 { 00:25:35.645 "name": "spare", 00:25:35.645 "uuid": "7a264957-3adb-5cd6-a76e-e0eaf6e346aa", 00:25:35.645 "is_configured": true, 00:25:35.645 "data_offset": 0, 00:25:35.645 "data_size": 65536 00:25:35.645 }, 00:25:35.645 { 00:25:35.645 "name": null, 00:25:35.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.645 "is_configured": false, 00:25:35.645 "data_offset": 0, 00:25:35.645 "data_size": 65536 00:25:35.645 }, 00:25:35.645 { 00:25:35.645 "name": "BaseBdev3", 00:25:35.645 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:35.645 "is_configured": true, 00:25:35.645 "data_offset": 0, 00:25:35.645 "data_size": 65536 00:25:35.645 }, 00:25:35.645 { 00:25:35.645 "name": "BaseBdev4", 00:25:35.645 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:35.645 "is_configured": true, 00:25:35.645 "data_offset": 0, 00:25:35.645 "data_size": 65536 00:25:35.645 } 00:25:35.645 ] 00:25:35.645 }' 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.904 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.165 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.165 "name": "raid_bdev1", 00:25:36.165 "uuid": "fdd886aa-d3ba-499a-8cbf-2ca0e0143142", 00:25:36.165 "strip_size_kb": 0, 00:25:36.165 "state": "online", 00:25:36.165 "raid_level": "raid1", 00:25:36.165 "superblock": false, 00:25:36.165 "num_base_bdevs": 4, 00:25:36.165 "num_base_bdevs_discovered": 3, 00:25:36.165 "num_base_bdevs_operational": 3, 00:25:36.165 "base_bdevs_list": [ 00:25:36.165 { 00:25:36.166 "name": "spare", 00:25:36.166 "uuid": 
"7a264957-3adb-5cd6-a76e-e0eaf6e346aa", 00:25:36.166 "is_configured": true, 00:25:36.166 "data_offset": 0, 00:25:36.166 "data_size": 65536 00:25:36.166 }, 00:25:36.166 { 00:25:36.166 "name": null, 00:25:36.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.166 "is_configured": false, 00:25:36.166 "data_offset": 0, 00:25:36.166 "data_size": 65536 00:25:36.166 }, 00:25:36.166 { 00:25:36.166 "name": "BaseBdev3", 00:25:36.166 "uuid": "6546c0b9-4882-5de9-88b2-c58f9dbca3ff", 00:25:36.166 "is_configured": true, 00:25:36.166 "data_offset": 0, 00:25:36.166 "data_size": 65536 00:25:36.166 }, 00:25:36.166 { 00:25:36.166 "name": "BaseBdev4", 00:25:36.166 "uuid": "5988a7a3-82e6-5f8e-9572-064dd253bb2f", 00:25:36.166 "is_configured": true, 00:25:36.166 "data_offset": 0, 00:25:36.166 "data_size": 65536 00:25:36.166 } 00:25:36.166 ] 00:25:36.166 }' 00:25:36.166 07:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.166 07:31:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:36.734 07:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:36.734 [2024-07-25 07:31:09.260620] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:36.734 [2024-07-25 07:31:09.260645] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:36.734 [2024-07-25 07:31:09.260697] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:36.734 [2024-07-25 07:31:09.260767] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:36.734 [2024-07-25 07:31:09.260778] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250c2f0 name raid_bdev1, state offline 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:36.992 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:37.251 /dev/nbd0 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:37.251 1+0 records in 00:25:37.251 1+0 records out 00:25:37.251 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000186529 s, 22.0 MB/s 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:37.251 07:31:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:37.509 /dev/nbd1 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 
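At this point the trace is verifying the rebuilt array over NBD: BaseBdev1 and the spare have been exported as /dev/nbd0 and /dev/nbd1, each device is checked for membership in /proc/partitions and then probed with a single 4 KiB O_DIRECT read, and the two exports are finally compared byte-for-byte with cmp. A minimal standalone sketch of that probe-and-compare flow follows; the wait_for_nbd helper name and the retry delay are illustrative assumptions, while the grep, dd and cmp invocations are the ones visible in the trace itself.

  # Sketch only: wait until an NBD export is readable, then compare two exports.
  wait_for_nbd() {
      local dev=$1
      for i in $(seq 1 20); do
          # The kernel lists the device in /proc/partitions once it is usable.
          grep -q -w "$(basename "$dev")" /proc/partitions && break
          sleep 0.1   # assumed retry delay, not taken from the trace
      done
      # One 4 KiB direct read confirms that I/O actually completes on the device.
      dd if="$dev" of=/dev/null bs=4096 count=1 iflag=direct
  }

  wait_for_nbd /dev/nbd0
  wait_for_nbd /dev/nbd1
  # Byte-for-byte comparison of the two exports, as the trace below performs with cmp -i 0.
  cmp -i 0 /dev/nbd0 /dev/nbd1 && echo "exports match"

The readiness probe for /dev/nbd1 continues in the trace that follows.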
00:25:37.509 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:37.509 1+0 records in 00:25:37.509 1+0 records out 00:25:37.509 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000195542 s, 20.9 MB/s 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:37.768 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:38.026 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 1733474 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 1733474 ']' 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 1733474 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1733474 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1733474' 00:25:38.285 killing process with pid 1733474 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 1733474 00:25:38.285 Received shutdown signal, test time was about 60.000000 seconds 00:25:38.285 00:25:38.285 Latency(us) 00:25:38.285 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:38.285 =================================================================================================================== 00:25:38.285 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:38.285 [2024-07-25 07:31:10.695577] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:38.285 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 1733474 00:25:38.285 [2024-07-25 07:31:10.735053] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:38.543 07:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:25:38.544 00:25:38.544 real 0m22.310s 00:25:38.544 user 0m30.549s 00:25:38.544 sys 0m4.466s 00:25:38.544 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:38.544 07:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:38.544 ************************************ 00:25:38.544 END TEST raid_rebuild_test 00:25:38.544 ************************************ 00:25:38.544 07:31:10 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:25:38.544 07:31:10 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:38.544 07:31:10 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:38.544 07:31:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:38.544 ************************************ 00:25:38.544 START TEST raid_rebuild_test_sb 00:25:38.544 ************************************ 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true false true 
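With raid_rebuild_test finished and its bdevperf process killed, the suite moves on to raid_rebuild_test_sb, which re-runs the same function with superblock support enabled (raid_rebuild_test raid1 4 true false true). As the trace that follows shows, this launches a bdevperf instance on a private RPC socket and then drives and inspects it through rpc.py. A minimal sketch of that launch-and-query pattern, using only flags and paths visible in the trace, is below; backgrounding with & and the SPDK_ROOT variable are illustrative assumptions rather than the suite's own mechanism (the real scripts wait for the socket with waitforlisten before issuing RPCs).

  # Sketch only: start bdevperf against a private RPC socket and query the RAID bdev.
  SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed variable name
  RPC_SOCK=/var/tmp/spdk-raid.sock

  "$SPDK_ROOT/build/examples/bdevperf" -r "$RPC_SOCK" -T raid_bdev1 \
      -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &

  # Once the socket is listening, the RAID bdev state can be inspected with the
  # same jq filter used throughout this trace.
  "$SPDK_ROOT/scripts/rpc.py" -s "$RPC_SOCK" bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1")'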
00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=1737476 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 1737476 /var/tmp/spdk-raid.sock 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 
3M -q 2 -U -z -L bdev_raid 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1737476 ']' 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:38.544 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:38.544 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:38.803 [2024-07-25 07:31:11.087057] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:25:38.803 [2024-07-25 07:31:11.087116] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1737476 ] 00:25:38.803 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:38.803 Zero copy mechanism will not be used. 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.0 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.1 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.2 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.3 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.4 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.5 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.6 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:01.7 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.0 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.1 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.2 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.3 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.4 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.5 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.6 cannot be used 00:25:38.803 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3d:02.7 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.0 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.1 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.2 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.3 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.4 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.5 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.6 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:01.7 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.0 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.1 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.2 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.3 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.4 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.5 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.6 cannot be used 00:25:38.803 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:38.803 EAL: Requested device 0000:3f:02.7 cannot be used 00:25:38.803 [2024-07-25 07:31:11.220313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:38.803 [2024-07-25 07:31:11.302967] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:25:39.062 [2024-07-25 07:31:11.363163] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:39.062 [2024-07-25 07:31:11.363197] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:39.629 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:39.629 07:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:25:39.629 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:39.629 07:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:39.629 BaseBdev1_malloc 00:25:39.887 07:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:39.887 [2024-07-25 07:31:12.371432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:39.887 [2024-07-25 07:31:12.371476] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:39.887 [2024-07-25 07:31:12.371497] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20a0690 00:25:39.887 [2024-07-25 07:31:12.371509] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:39.887 [2024-07-25 07:31:12.372901] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:39.887 [2024-07-25 07:31:12.372928] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:39.887 BaseBdev1 00:25:39.887 07:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:39.887 07:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:40.146 BaseBdev2_malloc 00:25:40.146 07:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:40.404 [2024-07-25 07:31:12.824922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:40.404 [2024-07-25 07:31:12.824964] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.404 [2024-07-25 07:31:12.824984] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20a1050 00:25:40.404 [2024-07-25 07:31:12.824996] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.404 [2024-07-25 07:31:12.826245] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.404 [2024-07-25 07:31:12.826272] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:40.404 BaseBdev2 00:25:40.404 07:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:40.404 07:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:40.663 BaseBdev3_malloc 00:25:40.663 07:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:40.921 [2024-07-25 07:31:13.286267] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:40.921 [2024-07-25 07:31:13.286306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.921 [2024-07-25 07:31:13.286325] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213f280 00:25:40.921 [2024-07-25 07:31:13.286336] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.921 [2024-07-25 07:31:13.287570] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.921 [2024-07-25 07:31:13.287596] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:40.921 BaseBdev3 00:25:40.921 07:31:13 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:40.921 07:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:41.179 BaseBdev4_malloc 00:25:41.179 07:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:41.437 [2024-07-25 07:31:13.735500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:41.437 [2024-07-25 07:31:13.735536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:41.437 [2024-07-25 07:31:13.735553] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2142600 00:25:41.437 [2024-07-25 07:31:13.735564] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:41.437 [2024-07-25 07:31:13.736796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:41.437 [2024-07-25 07:31:13.736821] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:41.437 BaseBdev4 00:25:41.437 07:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:41.437 spare_malloc 00:25:41.695 07:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:41.695 spare_delay 00:25:41.695 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:41.954 [2024-07-25 07:31:14.413417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:41.954 [2024-07-25 07:31:14.413457] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:41.954 [2024-07-25 07:31:14.413474] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x214c4c0 00:25:41.954 [2024-07-25 07:31:14.413486] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:41.954 [2024-07-25 07:31:14.414780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:41.954 [2024-07-25 07:31:14.414806] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:41.954 spare 00:25:41.954 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:42.212 [2024-07-25 07:31:14.638028] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:42.212 [2024-07-25 07:31:14.639074] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:42.212 [2024-07-25 07:31:14.639122] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:42.212 [2024-07-25 07:31:14.639171] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:25:42.212 [2024-07-25 07:31:14.639348] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21432f0 00:25:42.212 [2024-07-25 07:31:14.639359] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:42.212 [2024-07-25 07:31:14.639519] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20977b0 00:25:42.212 [2024-07-25 07:31:14.639651] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21432f0 00:25:42.212 [2024-07-25 07:31:14.639660] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21432f0 00:25:42.212 [2024-07-25 07:31:14.639744] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:42.212 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:42.213 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.213 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.471 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:42.471 "name": "raid_bdev1", 00:25:42.471 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:42.471 "strip_size_kb": 0, 00:25:42.471 "state": "online", 00:25:42.471 "raid_level": "raid1", 00:25:42.471 "superblock": true, 00:25:42.471 "num_base_bdevs": 4, 00:25:42.471 "num_base_bdevs_discovered": 4, 00:25:42.471 "num_base_bdevs_operational": 4, 00:25:42.471 "base_bdevs_list": [ 00:25:42.471 { 00:25:42.471 "name": "BaseBdev1", 00:25:42.471 "uuid": "dbbe7292-67fc-5ab2-b42a-243e88d38b55", 00:25:42.471 "is_configured": true, 00:25:42.471 "data_offset": 2048, 00:25:42.471 "data_size": 63488 00:25:42.471 }, 00:25:42.471 { 00:25:42.471 "name": "BaseBdev2", 00:25:42.471 "uuid": "8aa0e048-df3d-5101-8262-6056914fb358", 00:25:42.471 "is_configured": true, 00:25:42.471 "data_offset": 2048, 00:25:42.471 "data_size": 63488 00:25:42.471 }, 00:25:42.471 { 00:25:42.471 "name": "BaseBdev3", 00:25:42.471 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:42.471 "is_configured": true, 00:25:42.471 "data_offset": 2048, 00:25:42.471 "data_size": 63488 00:25:42.471 }, 00:25:42.471 { 00:25:42.471 "name": "BaseBdev4", 00:25:42.471 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:42.471 "is_configured": true, 00:25:42.471 "data_offset": 2048, 
00:25:42.471 "data_size": 63488 00:25:42.472 } 00:25:42.472 ] 00:25:42.472 }' 00:25:42.472 07:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:42.472 07:31:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:43.039 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:43.039 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:25:43.333 [2024-07-25 07:31:15.604832] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:43.333 07:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:43.591 [2024-07-25 07:31:16.069808] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20a3180 00:25:43.591 /dev/nbd0 00:25:43.591 07:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:43.591 07:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:43.591 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:43.591 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:43.592 1+0 records in 00:25:43.592 1+0 records out 00:25:43.592 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254325 s, 16.1 MB/s 00:25:43.592 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:25:43.850 07:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:50.412 63488+0 records in 00:25:50.412 63488+0 records out 00:25:50.412 32505856 bytes (33 MB, 31 MiB) copied, 6.16165 s, 5.3 MB/s 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:50.412 [2024-07-25 07:31:22.541801] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:50.412 [2024-07-25 07:31:22.762419] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.412 07:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.670 07:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:50.670 "name": "raid_bdev1", 00:25:50.670 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:50.670 "strip_size_kb": 0, 00:25:50.670 "state": "online", 00:25:50.670 "raid_level": "raid1", 00:25:50.670 "superblock": true, 00:25:50.670 "num_base_bdevs": 4, 00:25:50.670 "num_base_bdevs_discovered": 3, 00:25:50.670 "num_base_bdevs_operational": 3, 00:25:50.670 "base_bdevs_list": [ 00:25:50.670 { 00:25:50.670 "name": null, 00:25:50.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:50.670 "is_configured": false, 00:25:50.670 "data_offset": 2048, 00:25:50.670 "data_size": 63488 00:25:50.670 }, 00:25:50.670 { 00:25:50.670 "name": "BaseBdev2", 00:25:50.670 "uuid": "8aa0e048-df3d-5101-8262-6056914fb358", 00:25:50.670 "is_configured": true, 00:25:50.670 "data_offset": 2048, 00:25:50.670 "data_size": 63488 00:25:50.670 }, 00:25:50.670 { 00:25:50.670 "name": "BaseBdev3", 00:25:50.670 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:50.670 "is_configured": true, 00:25:50.670 "data_offset": 2048, 00:25:50.670 "data_size": 63488 00:25:50.670 }, 00:25:50.670 { 00:25:50.670 "name": "BaseBdev4", 00:25:50.670 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:50.670 "is_configured": true, 00:25:50.670 "data_offset": 2048, 00:25:50.670 "data_size": 63488 00:25:50.670 } 00:25:50.670 ] 00:25:50.670 }' 00:25:50.670 07:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:50.670 07:31:23 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:51.238 07:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:51.238 [2024-07-25 07:31:23.769072] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:51.496 [2024-07-25 07:31:23.772981] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20a35d0 00:25:51.496 [2024-07-25 07:31:23.775134] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:51.496 07:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:52.430 07:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:52.430 07:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.430 07:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:52.430 07:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:52.430 07:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.430 07:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.430 07:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.688 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:52.688 "name": "raid_bdev1", 00:25:52.688 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:52.688 "strip_size_kb": 0, 00:25:52.688 "state": "online", 00:25:52.688 "raid_level": "raid1", 00:25:52.688 "superblock": true, 00:25:52.688 "num_base_bdevs": 4, 00:25:52.688 "num_base_bdevs_discovered": 4, 00:25:52.688 "num_base_bdevs_operational": 4, 00:25:52.688 "process": { 00:25:52.688 "type": "rebuild", 00:25:52.688 "target": "spare", 00:25:52.688 "progress": { 00:25:52.688 "blocks": 24576, 00:25:52.688 "percent": 38 00:25:52.688 } 00:25:52.688 }, 00:25:52.688 "base_bdevs_list": [ 00:25:52.688 { 00:25:52.688 "name": "spare", 00:25:52.688 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:25:52.688 "is_configured": true, 00:25:52.688 "data_offset": 2048, 00:25:52.688 "data_size": 63488 00:25:52.688 }, 00:25:52.688 { 00:25:52.688 "name": "BaseBdev2", 00:25:52.688 "uuid": "8aa0e048-df3d-5101-8262-6056914fb358", 00:25:52.688 "is_configured": true, 00:25:52.688 "data_offset": 2048, 00:25:52.688 "data_size": 63488 00:25:52.688 }, 00:25:52.688 { 00:25:52.688 "name": "BaseBdev3", 00:25:52.688 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:52.688 "is_configured": true, 00:25:52.688 "data_offset": 2048, 00:25:52.688 "data_size": 63488 00:25:52.688 }, 00:25:52.688 { 00:25:52.688 "name": "BaseBdev4", 00:25:52.688 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:52.688 "is_configured": true, 00:25:52.688 "data_offset": 2048, 00:25:52.688 "data_size": 63488 00:25:52.688 } 00:25:52.688 ] 00:25:52.688 }' 00:25:52.688 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:52.688 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:52.688 07:31:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:52.688 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:52.688 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:52.947 [2024-07-25 07:31:25.324165] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:52.947 [2024-07-25 07:31:25.386898] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:52.947 [2024-07-25 07:31:25.386941] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:52.947 [2024-07-25 07:31:25.386957] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:52.947 [2024-07-25 07:31:25.386964] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.947 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.206 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.206 "name": "raid_bdev1", 00:25:53.206 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:53.206 "strip_size_kb": 0, 00:25:53.206 "state": "online", 00:25:53.206 "raid_level": "raid1", 00:25:53.206 "superblock": true, 00:25:53.206 "num_base_bdevs": 4, 00:25:53.206 "num_base_bdevs_discovered": 3, 00:25:53.206 "num_base_bdevs_operational": 3, 00:25:53.206 "base_bdevs_list": [ 00:25:53.206 { 00:25:53.206 "name": null, 00:25:53.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.206 "is_configured": false, 00:25:53.206 "data_offset": 2048, 00:25:53.206 "data_size": 63488 00:25:53.206 }, 00:25:53.206 { 00:25:53.206 "name": "BaseBdev2", 00:25:53.206 "uuid": "8aa0e048-df3d-5101-8262-6056914fb358", 00:25:53.206 "is_configured": true, 00:25:53.206 "data_offset": 2048, 00:25:53.206 "data_size": 63488 00:25:53.206 }, 00:25:53.206 { 00:25:53.206 "name": "BaseBdev3", 00:25:53.206 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:53.206 "is_configured": true, 
00:25:53.206 "data_offset": 2048, 00:25:53.206 "data_size": 63488 00:25:53.206 }, 00:25:53.206 { 00:25:53.206 "name": "BaseBdev4", 00:25:53.206 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:53.206 "is_configured": true, 00:25:53.206 "data_offset": 2048, 00:25:53.206 "data_size": 63488 00:25:53.206 } 00:25:53.206 ] 00:25:53.206 }' 00:25:53.206 07:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.206 07:31:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:53.773 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:53.773 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:53.773 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:53.773 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:53.773 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:53.773 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.773 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.031 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.031 "name": "raid_bdev1", 00:25:54.031 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:54.031 "strip_size_kb": 0, 00:25:54.031 "state": "online", 00:25:54.031 "raid_level": "raid1", 00:25:54.031 "superblock": true, 00:25:54.031 "num_base_bdevs": 4, 00:25:54.031 "num_base_bdevs_discovered": 3, 00:25:54.031 "num_base_bdevs_operational": 3, 00:25:54.031 "base_bdevs_list": [ 00:25:54.031 { 00:25:54.031 "name": null, 00:25:54.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.031 "is_configured": false, 00:25:54.031 "data_offset": 2048, 00:25:54.031 "data_size": 63488 00:25:54.031 }, 00:25:54.031 { 00:25:54.031 "name": "BaseBdev2", 00:25:54.031 "uuid": "8aa0e048-df3d-5101-8262-6056914fb358", 00:25:54.031 "is_configured": true, 00:25:54.031 "data_offset": 2048, 00:25:54.031 "data_size": 63488 00:25:54.031 }, 00:25:54.031 { 00:25:54.031 "name": "BaseBdev3", 00:25:54.031 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:54.031 "is_configured": true, 00:25:54.031 "data_offset": 2048, 00:25:54.031 "data_size": 63488 00:25:54.031 }, 00:25:54.031 { 00:25:54.031 "name": "BaseBdev4", 00:25:54.031 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:54.031 "is_configured": true, 00:25:54.031 "data_offset": 2048, 00:25:54.031 "data_size": 63488 00:25:54.031 } 00:25:54.031 ] 00:25:54.031 }' 00:25:54.031 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.031 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:54.031 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.031 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:54.031 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:54.290 [2024-07-25 07:31:26.758394] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:54.290 [2024-07-25 07:31:26.762293] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21409e0 00:25:54.290 [2024-07-25 07:31:26.763765] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:54.290 07:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:55.666 07:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:55.666 07:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.666 07:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:55.666 07:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:55.666 07:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.666 07:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.666 07:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.666 "name": "raid_bdev1", 00:25:55.666 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:55.666 "strip_size_kb": 0, 00:25:55.666 "state": "online", 00:25:55.666 "raid_level": "raid1", 00:25:55.666 "superblock": true, 00:25:55.666 "num_base_bdevs": 4, 00:25:55.666 "num_base_bdevs_discovered": 4, 00:25:55.666 "num_base_bdevs_operational": 4, 00:25:55.666 "process": { 00:25:55.666 "type": "rebuild", 00:25:55.666 "target": "spare", 00:25:55.666 "progress": { 00:25:55.666 "blocks": 24576, 00:25:55.666 "percent": 38 00:25:55.666 } 00:25:55.666 }, 00:25:55.666 "base_bdevs_list": [ 00:25:55.666 { 00:25:55.666 "name": "spare", 00:25:55.666 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:25:55.666 "is_configured": true, 00:25:55.666 "data_offset": 2048, 00:25:55.666 "data_size": 63488 00:25:55.666 }, 00:25:55.666 { 00:25:55.666 "name": "BaseBdev2", 00:25:55.666 "uuid": "8aa0e048-df3d-5101-8262-6056914fb358", 00:25:55.666 "is_configured": true, 00:25:55.666 "data_offset": 2048, 00:25:55.666 "data_size": 63488 00:25:55.666 }, 00:25:55.666 { 00:25:55.666 "name": "BaseBdev3", 00:25:55.666 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:55.666 "is_configured": true, 00:25:55.666 "data_offset": 2048, 00:25:55.666 "data_size": 63488 00:25:55.666 }, 00:25:55.666 { 00:25:55.666 "name": "BaseBdev4", 00:25:55.666 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:55.666 "is_configured": true, 00:25:55.666 "data_offset": 2048, 00:25:55.666 "data_size": 63488 00:25:55.666 } 00:25:55.666 ] 00:25:55.666 }' 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:25:55.666 07:31:28 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:25:55.666 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:25:55.666 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:55.923 [2024-07-25 07:31:28.304772] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:56.181 [2024-07-25 07:31:28.475777] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x21409e0 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.181 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.440 "name": "raid_bdev1", 00:25:56.440 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:56.440 "strip_size_kb": 0, 00:25:56.440 "state": "online", 00:25:56.440 "raid_level": "raid1", 00:25:56.440 "superblock": true, 00:25:56.440 "num_base_bdevs": 4, 00:25:56.440 "num_base_bdevs_discovered": 3, 00:25:56.440 "num_base_bdevs_operational": 3, 00:25:56.440 "process": { 00:25:56.440 "type": "rebuild", 00:25:56.440 "target": "spare", 00:25:56.440 "progress": { 00:25:56.440 "blocks": 36864, 00:25:56.440 "percent": 58 00:25:56.440 } 00:25:56.440 }, 00:25:56.440 "base_bdevs_list": [ 00:25:56.440 { 00:25:56.440 "name": "spare", 00:25:56.440 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:25:56.440 "is_configured": true, 00:25:56.440 "data_offset": 2048, 00:25:56.440 "data_size": 63488 00:25:56.440 }, 00:25:56.440 { 00:25:56.440 "name": null, 00:25:56.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.440 "is_configured": false, 00:25:56.440 "data_offset": 2048, 00:25:56.440 "data_size": 63488 00:25:56.440 }, 00:25:56.440 { 00:25:56.440 "name": "BaseBdev3", 00:25:56.440 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:56.440 "is_configured": true, 00:25:56.440 "data_offset": 2048, 00:25:56.440 "data_size": 63488 00:25:56.440 }, 00:25:56.440 { 00:25:56.440 "name": "BaseBdev4", 00:25:56.440 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 
00:25:56.440 "is_configured": true, 00:25:56.440 "data_offset": 2048, 00:25:56.440 "data_size": 63488 00:25:56.440 } 00:25:56.440 ] 00:25:56.440 }' 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=866 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.440 07:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.699 07:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.699 "name": "raid_bdev1", 00:25:56.699 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:56.699 "strip_size_kb": 0, 00:25:56.699 "state": "online", 00:25:56.699 "raid_level": "raid1", 00:25:56.699 "superblock": true, 00:25:56.699 "num_base_bdevs": 4, 00:25:56.699 "num_base_bdevs_discovered": 3, 00:25:56.699 "num_base_bdevs_operational": 3, 00:25:56.699 "process": { 00:25:56.699 "type": "rebuild", 00:25:56.699 "target": "spare", 00:25:56.699 "progress": { 00:25:56.699 "blocks": 43008, 00:25:56.699 "percent": 67 00:25:56.699 } 00:25:56.699 }, 00:25:56.699 "base_bdevs_list": [ 00:25:56.699 { 00:25:56.699 "name": "spare", 00:25:56.699 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:25:56.700 "is_configured": true, 00:25:56.700 "data_offset": 2048, 00:25:56.700 "data_size": 63488 00:25:56.700 }, 00:25:56.700 { 00:25:56.700 "name": null, 00:25:56.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.700 "is_configured": false, 00:25:56.700 "data_offset": 2048, 00:25:56.700 "data_size": 63488 00:25:56.700 }, 00:25:56.700 { 00:25:56.700 "name": "BaseBdev3", 00:25:56.700 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:56.700 "is_configured": true, 00:25:56.700 "data_offset": 2048, 00:25:56.700 "data_size": 63488 00:25:56.700 }, 00:25:56.700 { 00:25:56.700 "name": "BaseBdev4", 00:25:56.700 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:56.700 "is_configured": true, 00:25:56.700 "data_offset": 2048, 00:25:56.700 "data_size": 63488 00:25:56.700 } 00:25:56.700 ] 00:25:56.700 }' 00:25:56.700 07:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.700 07:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d 
]] 00:25:56.700 07:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.700 07:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:56.700 07:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:57.634 [2024-07-25 07:31:29.986747] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:57.634 [2024-07-25 07:31:29.986804] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:57.634 [2024-07-25 07:31:29.986896] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.634 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.892 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:57.892 "name": "raid_bdev1", 00:25:57.892 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:57.892 "strip_size_kb": 0, 00:25:57.892 "state": "online", 00:25:57.892 "raid_level": "raid1", 00:25:57.892 "superblock": true, 00:25:57.892 "num_base_bdevs": 4, 00:25:57.892 "num_base_bdevs_discovered": 3, 00:25:57.892 "num_base_bdevs_operational": 3, 00:25:57.892 "base_bdevs_list": [ 00:25:57.892 { 00:25:57.892 "name": "spare", 00:25:57.892 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:25:57.892 "is_configured": true, 00:25:57.892 "data_offset": 2048, 00:25:57.892 "data_size": 63488 00:25:57.892 }, 00:25:57.892 { 00:25:57.892 "name": null, 00:25:57.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.892 "is_configured": false, 00:25:57.892 "data_offset": 2048, 00:25:57.892 "data_size": 63488 00:25:57.892 }, 00:25:57.892 { 00:25:57.892 "name": "BaseBdev3", 00:25:57.892 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:57.892 "is_configured": true, 00:25:57.892 "data_offset": 2048, 00:25:57.892 "data_size": 63488 00:25:57.892 }, 00:25:57.892 { 00:25:57.892 "name": "BaseBdev4", 00:25:57.892 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:57.892 "is_configured": true, 00:25:57.892 "data_offset": 2048, 00:25:57.892 "data_size": 63488 00:25:57.892 } 00:25:57.892 ] 00:25:57.892 }' 00:25:57.892 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.892 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:57.892 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none 
== \s\p\a\r\e ]] 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.150 "name": "raid_bdev1", 00:25:58.150 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:58.150 "strip_size_kb": 0, 00:25:58.150 "state": "online", 00:25:58.150 "raid_level": "raid1", 00:25:58.150 "superblock": true, 00:25:58.150 "num_base_bdevs": 4, 00:25:58.150 "num_base_bdevs_discovered": 3, 00:25:58.150 "num_base_bdevs_operational": 3, 00:25:58.150 "base_bdevs_list": [ 00:25:58.150 { 00:25:58.150 "name": "spare", 00:25:58.150 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:25:58.150 "is_configured": true, 00:25:58.150 "data_offset": 2048, 00:25:58.150 "data_size": 63488 00:25:58.150 }, 00:25:58.150 { 00:25:58.150 "name": null, 00:25:58.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.150 "is_configured": false, 00:25:58.150 "data_offset": 2048, 00:25:58.150 "data_size": 63488 00:25:58.150 }, 00:25:58.150 { 00:25:58.150 "name": "BaseBdev3", 00:25:58.150 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:58.150 "is_configured": true, 00:25:58.150 "data_offset": 2048, 00:25:58.150 "data_size": 63488 00:25:58.150 }, 00:25:58.150 { 00:25:58.150 "name": "BaseBdev4", 00:25:58.150 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:58.150 "is_configured": true, 00:25:58.150 "data_offset": 2048, 00:25:58.150 "data_size": 63488 00:25:58.150 } 00:25:58.150 ] 00:25:58.150 }' 00:25:58.150 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:58.408 
07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:58.408 "name": "raid_bdev1", 00:25:58.408 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:25:58.408 "strip_size_kb": 0, 00:25:58.408 "state": "online", 00:25:58.408 "raid_level": "raid1", 00:25:58.408 "superblock": true, 00:25:58.408 "num_base_bdevs": 4, 00:25:58.408 "num_base_bdevs_discovered": 3, 00:25:58.408 "num_base_bdevs_operational": 3, 00:25:58.408 "base_bdevs_list": [ 00:25:58.408 { 00:25:58.408 "name": "spare", 00:25:58.408 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:25:58.408 "is_configured": true, 00:25:58.408 "data_offset": 2048, 00:25:58.408 "data_size": 63488 00:25:58.408 }, 00:25:58.408 { 00:25:58.408 "name": null, 00:25:58.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.408 "is_configured": false, 00:25:58.408 "data_offset": 2048, 00:25:58.408 "data_size": 63488 00:25:58.408 }, 00:25:58.408 { 00:25:58.408 "name": "BaseBdev3", 00:25:58.408 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:25:58.408 "is_configured": true, 00:25:58.408 "data_offset": 2048, 00:25:58.408 "data_size": 63488 00:25:58.408 }, 00:25:58.408 { 00:25:58.408 "name": "BaseBdev4", 00:25:58.408 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:25:58.408 "is_configured": true, 00:25:58.408 "data_offset": 2048, 00:25:58.408 "data_size": 63488 00:25:58.408 } 00:25:58.408 ] 00:25:58.408 }' 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:58.408 07:31:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:58.974 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:59.232 [2024-07-25 07:31:31.694970] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:59.232 [2024-07-25 07:31:31.694997] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:59.232 [2024-07-25 07:31:31.695053] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:59.232 [2024-07-25 07:31:31.695119] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:59.232 [2024-07-25 07:31:31.695130] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21432f0 name raid_bdev1, state offline 00:25:59.232 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length 00:25:59.232 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.490 07:31:31 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:59.490 07:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:59.748 /dev/nbd0 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:59.748 1+0 records in 00:25:59.748 1+0 records out 00:25:59.748 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024281 s, 16.9 MB/s 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:25:59.748 
07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:59.748 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:59.749 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:00.008 /dev/nbd1 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:00.008 1+0 records in 00:26:00.008 1+0 records out 00:26:00.008 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305823 s, 13.4 MB/s 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:00.008 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:26:00.009 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:00.009 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:00.009 07:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:26:00.009 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:00.009 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:00.009 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.266 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:00.524 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:00.524 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.524 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:00.524 07:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:00.524 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:00.524 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:00.524 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:00.524 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.524 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.524 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:00.782 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:00.782 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.782 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:26:00.783 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:00.783 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:01.040 [2024-07-25 07:31:33.496585] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:01.040 [2024-07-25 07:31:33.496630] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:01.040 [2024-07-25 07:31:33.496650] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20a2600 00:26:01.040 [2024-07-25 07:31:33.496662] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:01.040 [2024-07-25 07:31:33.498203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:01.040 [2024-07-25 07:31:33.498232] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:01.040 [2024-07-25 07:31:33.498305] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:01.040 [2024-07-25 07:31:33.498331] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:01.040 [2024-07-25 07:31:33.498426] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:01.040 [2024-07-25 07:31:33.498494] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:01.040 spare 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.040 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.297 [2024-07-25 07:31:33.598802] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2098080 00:26:01.297 [2024-07-25 07:31:33.598817] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:01.297 [2024-07-25 07:31:33.598992] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2097290 00:26:01.297 [2024-07-25 07:31:33.599129] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2098080 00:26:01.297 [2024-07-25 07:31:33.599145] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2098080 00:26:01.297 [2024-07-25 07:31:33.599240] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:01.297 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.297 "name": "raid_bdev1", 00:26:01.297 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:01.297 "strip_size_kb": 0, 00:26:01.297 "state": "online", 00:26:01.297 "raid_level": "raid1", 00:26:01.297 "superblock": true, 00:26:01.297 "num_base_bdevs": 4, 00:26:01.297 "num_base_bdevs_discovered": 3, 00:26:01.297 "num_base_bdevs_operational": 3, 00:26:01.297 "base_bdevs_list": [ 00:26:01.297 { 00:26:01.297 "name": "spare", 00:26:01.297 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:26:01.297 "is_configured": true, 00:26:01.297 "data_offset": 2048, 00:26:01.297 "data_size": 63488 00:26:01.297 }, 00:26:01.297 { 00:26:01.297 "name": null, 00:26:01.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.297 "is_configured": false, 00:26:01.297 "data_offset": 2048, 00:26:01.297 "data_size": 63488 00:26:01.297 }, 00:26:01.297 { 00:26:01.297 "name": "BaseBdev3", 00:26:01.297 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:01.297 "is_configured": true, 00:26:01.297 "data_offset": 2048, 00:26:01.297 "data_size": 
63488 00:26:01.297 }, 00:26:01.297 { 00:26:01.297 "name": "BaseBdev4", 00:26:01.297 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:01.297 "is_configured": true, 00:26:01.297 "data_offset": 2048, 00:26:01.297 "data_size": 63488 00:26:01.297 } 00:26:01.297 ] 00:26:01.297 }' 00:26:01.297 07:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.297 07:31:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:01.863 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:01.863 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.863 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:01.863 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:01.863 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:01.863 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.863 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.122 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:02.122 "name": "raid_bdev1", 00:26:02.122 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:02.122 "strip_size_kb": 0, 00:26:02.122 "state": "online", 00:26:02.122 "raid_level": "raid1", 00:26:02.122 "superblock": true, 00:26:02.122 "num_base_bdevs": 4, 00:26:02.122 "num_base_bdevs_discovered": 3, 00:26:02.122 "num_base_bdevs_operational": 3, 00:26:02.122 "base_bdevs_list": [ 00:26:02.122 { 00:26:02.122 "name": "spare", 00:26:02.122 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:26:02.122 "is_configured": true, 00:26:02.122 "data_offset": 2048, 00:26:02.122 "data_size": 63488 00:26:02.122 }, 00:26:02.122 { 00:26:02.122 "name": null, 00:26:02.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.122 "is_configured": false, 00:26:02.122 "data_offset": 2048, 00:26:02.122 "data_size": 63488 00:26:02.122 }, 00:26:02.122 { 00:26:02.122 "name": "BaseBdev3", 00:26:02.122 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:02.122 "is_configured": true, 00:26:02.122 "data_offset": 2048, 00:26:02.122 "data_size": 63488 00:26:02.122 }, 00:26:02.122 { 00:26:02.122 "name": "BaseBdev4", 00:26:02.122 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:02.122 "is_configured": true, 00:26:02.122 "data_offset": 2048, 00:26:02.122 "data_size": 63488 00:26:02.122 } 00:26:02.122 ] 00:26:02.122 }' 00:26:02.122 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:02.122 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:02.122 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.122 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:02.122 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.122 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:02.380 
07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:26:02.380 07:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:02.639 [2024-07-25 07:31:35.056869] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.639 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.898 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:02.898 "name": "raid_bdev1", 00:26:02.898 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:02.898 "strip_size_kb": 0, 00:26:02.898 "state": "online", 00:26:02.898 "raid_level": "raid1", 00:26:02.898 "superblock": true, 00:26:02.898 "num_base_bdevs": 4, 00:26:02.898 "num_base_bdevs_discovered": 2, 00:26:02.898 "num_base_bdevs_operational": 2, 00:26:02.898 "base_bdevs_list": [ 00:26:02.898 { 00:26:02.898 "name": null, 00:26:02.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.898 "is_configured": false, 00:26:02.898 "data_offset": 2048, 00:26:02.898 "data_size": 63488 00:26:02.898 }, 00:26:02.898 { 00:26:02.898 "name": null, 00:26:02.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.898 "is_configured": false, 00:26:02.898 "data_offset": 2048, 00:26:02.898 "data_size": 63488 00:26:02.898 }, 00:26:02.898 { 00:26:02.898 "name": "BaseBdev3", 00:26:02.898 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:02.898 "is_configured": true, 00:26:02.898 "data_offset": 2048, 00:26:02.898 "data_size": 63488 00:26:02.898 }, 00:26:02.898 { 00:26:02.898 "name": "BaseBdev4", 00:26:02.898 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:02.898 "is_configured": true, 00:26:02.898 "data_offset": 2048, 00:26:02.898 "data_size": 63488 00:26:02.898 } 00:26:02.898 ] 00:26:02.898 }' 00:26:02.898 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:02.898 07:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:03.465 07:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:03.724 [2024-07-25 07:31:36.079578] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:03.724 [2024-07-25 07:31:36.079721] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:03.724 [2024-07-25 07:31:36.079738] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:03.724 [2024-07-25 07:31:36.079766] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:03.724 [2024-07-25 07:31:36.083592] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2099210 00:26:03.724 [2024-07-25 07:31:36.084858] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:03.724 07:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:26:04.658 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:04.658 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.658 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:04.658 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:04.658 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.658 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.658 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.916 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.916 "name": "raid_bdev1", 00:26:04.916 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:04.916 "strip_size_kb": 0, 00:26:04.916 "state": "online", 00:26:04.916 "raid_level": "raid1", 00:26:04.916 "superblock": true, 00:26:04.916 "num_base_bdevs": 4, 00:26:04.916 "num_base_bdevs_discovered": 3, 00:26:04.916 "num_base_bdevs_operational": 3, 00:26:04.916 "process": { 00:26:04.916 "type": "rebuild", 00:26:04.916 "target": "spare", 00:26:04.916 "progress": { 00:26:04.916 "blocks": 24576, 00:26:04.916 "percent": 38 00:26:04.916 } 00:26:04.916 }, 00:26:04.916 "base_bdevs_list": [ 00:26:04.916 { 00:26:04.916 "name": "spare", 00:26:04.916 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:26:04.916 "is_configured": true, 00:26:04.916 "data_offset": 2048, 00:26:04.916 "data_size": 63488 00:26:04.916 }, 00:26:04.916 { 00:26:04.916 "name": null, 00:26:04.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.916 "is_configured": false, 00:26:04.916 "data_offset": 2048, 00:26:04.916 "data_size": 63488 00:26:04.916 }, 00:26:04.916 { 00:26:04.916 "name": "BaseBdev3", 00:26:04.916 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:04.916 "is_configured": true, 00:26:04.916 "data_offset": 2048, 00:26:04.916 "data_size": 63488 00:26:04.916 }, 00:26:04.916 { 00:26:04.916 "name": "BaseBdev4", 00:26:04.916 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:04.917 "is_configured": true, 00:26:04.917 "data_offset": 2048, 00:26:04.917 "data_size": 63488 00:26:04.917 } 00:26:04.917 ] 00:26:04.917 
}' 00:26:04.917 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.917 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:04.917 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.917 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:04.917 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:05.175 [2024-07-25 07:31:37.624532] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:05.175 [2024-07-25 07:31:37.696532] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:05.175 [2024-07-25 07:31:37.696573] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:05.175 [2024-07-25 07:31:37.696587] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:05.176 [2024-07-25 07:31:37.696595] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:05.435 "name": "raid_bdev1", 00:26:05.435 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:05.435 "strip_size_kb": 0, 00:26:05.435 "state": "online", 00:26:05.435 "raid_level": "raid1", 00:26:05.435 "superblock": true, 00:26:05.435 "num_base_bdevs": 4, 00:26:05.435 "num_base_bdevs_discovered": 2, 00:26:05.435 "num_base_bdevs_operational": 2, 00:26:05.435 "base_bdevs_list": [ 00:26:05.435 { 00:26:05.435 "name": null, 00:26:05.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.435 "is_configured": false, 00:26:05.435 "data_offset": 2048, 00:26:05.435 "data_size": 63488 00:26:05.435 }, 00:26:05.435 { 00:26:05.435 "name": null, 00:26:05.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.435 
"is_configured": false, 00:26:05.435 "data_offset": 2048, 00:26:05.435 "data_size": 63488 00:26:05.435 }, 00:26:05.435 { 00:26:05.435 "name": "BaseBdev3", 00:26:05.435 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:05.435 "is_configured": true, 00:26:05.435 "data_offset": 2048, 00:26:05.435 "data_size": 63488 00:26:05.435 }, 00:26:05.435 { 00:26:05.435 "name": "BaseBdev4", 00:26:05.435 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:05.435 "is_configured": true, 00:26:05.435 "data_offset": 2048, 00:26:05.435 "data_size": 63488 00:26:05.435 } 00:26:05.435 ] 00:26:05.435 }' 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:05.435 07:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:06.001 07:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:06.260 [2024-07-25 07:31:38.731028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:06.260 [2024-07-25 07:31:38.731078] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.260 [2024-07-25 07:31:38.731099] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x209b090 00:26:06.260 [2024-07-25 07:31:38.731110] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.260 [2024-07-25 07:31:38.731461] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:06.260 [2024-07-25 07:31:38.731479] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:06.260 [2024-07-25 07:31:38.731552] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:06.260 [2024-07-25 07:31:38.731563] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:06.260 [2024-07-25 07:31:38.731575] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:06.260 [2024-07-25 07:31:38.731592] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:06.260 [2024-07-25 07:31:38.735406] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x213fdd0 00:26:06.260 spare 00:26:06.260 [2024-07-25 07:31:38.736672] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:06.260 07:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.635 "name": "raid_bdev1", 00:26:07.635 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:07.635 "strip_size_kb": 0, 00:26:07.635 "state": "online", 00:26:07.635 "raid_level": "raid1", 00:26:07.635 "superblock": true, 00:26:07.635 "num_base_bdevs": 4, 00:26:07.635 "num_base_bdevs_discovered": 3, 00:26:07.635 "num_base_bdevs_operational": 3, 00:26:07.635 "process": { 00:26:07.635 "type": "rebuild", 00:26:07.635 "target": "spare", 00:26:07.635 "progress": { 00:26:07.635 "blocks": 24576, 00:26:07.635 "percent": 38 00:26:07.635 } 00:26:07.635 }, 00:26:07.635 "base_bdevs_list": [ 00:26:07.635 { 00:26:07.635 "name": "spare", 00:26:07.635 "uuid": "d9347e4b-af15-581f-be4a-5eec6bcf10fe", 00:26:07.635 "is_configured": true, 00:26:07.635 "data_offset": 2048, 00:26:07.635 "data_size": 63488 00:26:07.635 }, 00:26:07.635 { 00:26:07.635 "name": null, 00:26:07.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.635 "is_configured": false, 00:26:07.635 "data_offset": 2048, 00:26:07.635 "data_size": 63488 00:26:07.635 }, 00:26:07.635 { 00:26:07.635 "name": "BaseBdev3", 00:26:07.635 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:07.635 "is_configured": true, 00:26:07.635 "data_offset": 2048, 00:26:07.635 "data_size": 63488 00:26:07.635 }, 00:26:07.635 { 00:26:07.635 "name": "BaseBdev4", 00:26:07.635 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:07.635 "is_configured": true, 00:26:07.635 "data_offset": 2048, 00:26:07.635 "data_size": 63488 00:26:07.635 } 00:26:07.635 ] 00:26:07.635 }' 00:26:07.635 07:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.635 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:07.635 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.635 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:07.635 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:07.894 [2024-07-25 07:31:40.284347] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.894 [2024-07-25 07:31:40.348390] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:07.894 [2024-07-25 07:31:40.348430] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.894 [2024-07-25 07:31:40.348445] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.894 [2024-07-25 07:31:40.348453] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.894 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.152 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:08.152 "name": "raid_bdev1", 00:26:08.152 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:08.152 "strip_size_kb": 0, 00:26:08.152 "state": "online", 00:26:08.152 "raid_level": "raid1", 00:26:08.152 "superblock": true, 00:26:08.152 "num_base_bdevs": 4, 00:26:08.152 "num_base_bdevs_discovered": 2, 00:26:08.152 "num_base_bdevs_operational": 2, 00:26:08.152 "base_bdevs_list": [ 00:26:08.152 { 00:26:08.152 "name": null, 00:26:08.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.152 "is_configured": false, 00:26:08.152 "data_offset": 2048, 00:26:08.152 "data_size": 63488 00:26:08.152 }, 00:26:08.152 { 00:26:08.152 "name": null, 00:26:08.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.152 "is_configured": false, 00:26:08.152 "data_offset": 2048, 00:26:08.152 "data_size": 63488 00:26:08.152 }, 00:26:08.152 { 00:26:08.152 "name": "BaseBdev3", 00:26:08.152 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:08.152 "is_configured": true, 00:26:08.152 "data_offset": 2048, 00:26:08.152 "data_size": 63488 00:26:08.152 }, 00:26:08.152 { 00:26:08.152 "name": "BaseBdev4", 00:26:08.152 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:08.152 "is_configured": true, 00:26:08.152 "data_offset": 2048, 00:26:08.152 "data_size": 63488 
00:26:08.152 } 00:26:08.152 ] 00:26:08.152 }' 00:26:08.152 07:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.152 07:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:08.719 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:08.719 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:08.719 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:08.719 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:08.719 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.719 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.719 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.977 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.977 "name": "raid_bdev1", 00:26:08.977 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:08.977 "strip_size_kb": 0, 00:26:08.977 "state": "online", 00:26:08.977 "raid_level": "raid1", 00:26:08.977 "superblock": true, 00:26:08.977 "num_base_bdevs": 4, 00:26:08.977 "num_base_bdevs_discovered": 2, 00:26:08.977 "num_base_bdevs_operational": 2, 00:26:08.977 "base_bdevs_list": [ 00:26:08.977 { 00:26:08.977 "name": null, 00:26:08.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.977 "is_configured": false, 00:26:08.977 "data_offset": 2048, 00:26:08.977 "data_size": 63488 00:26:08.977 }, 00:26:08.977 { 00:26:08.977 "name": null, 00:26:08.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.977 "is_configured": false, 00:26:08.977 "data_offset": 2048, 00:26:08.977 "data_size": 63488 00:26:08.977 }, 00:26:08.977 { 00:26:08.977 "name": "BaseBdev3", 00:26:08.977 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:08.977 "is_configured": true, 00:26:08.977 "data_offset": 2048, 00:26:08.977 "data_size": 63488 00:26:08.977 }, 00:26:08.977 { 00:26:08.977 "name": "BaseBdev4", 00:26:08.977 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:08.977 "is_configured": true, 00:26:08.977 "data_offset": 2048, 00:26:08.977 "data_size": 63488 00:26:08.977 } 00:26:08.977 ] 00:26:08.977 }' 00:26:08.977 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.977 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:08.977 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.977 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:08.977 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:09.235 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:09.494 [2024-07-25 07:31:41.952917] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:09.494 
[2024-07-25 07:31:41.952963] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:09.494 [2024-07-25 07:31:41.952982] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2099150 00:26:09.494 [2024-07-25 07:31:41.952994] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:09.494 [2024-07-25 07:31:41.953322] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:09.494 [2024-07-25 07:31:41.953340] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:09.494 [2024-07-25 07:31:41.953399] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:09.494 [2024-07-25 07:31:41.953411] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:09.494 [2024-07-25 07:31:41.953422] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:09.494 BaseBdev1 00:26:09.494 07:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.869 07:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.869 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:10.869 "name": "raid_bdev1", 00:26:10.869 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:10.869 "strip_size_kb": 0, 00:26:10.869 "state": "online", 00:26:10.869 "raid_level": "raid1", 00:26:10.869 "superblock": true, 00:26:10.869 "num_base_bdevs": 4, 00:26:10.869 "num_base_bdevs_discovered": 2, 00:26:10.869 "num_base_bdevs_operational": 2, 00:26:10.869 "base_bdevs_list": [ 00:26:10.869 { 00:26:10.869 "name": null, 00:26:10.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.869 "is_configured": false, 00:26:10.869 "data_offset": 2048, 00:26:10.869 "data_size": 63488 00:26:10.869 }, 00:26:10.869 { 00:26:10.869 "name": null, 00:26:10.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.869 "is_configured": false, 00:26:10.869 "data_offset": 2048, 00:26:10.869 "data_size": 63488 00:26:10.869 }, 00:26:10.869 { 
00:26:10.869 "name": "BaseBdev3", 00:26:10.869 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:10.869 "is_configured": true, 00:26:10.869 "data_offset": 2048, 00:26:10.869 "data_size": 63488 00:26:10.869 }, 00:26:10.869 { 00:26:10.869 "name": "BaseBdev4", 00:26:10.869 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:10.869 "is_configured": true, 00:26:10.869 "data_offset": 2048, 00:26:10.869 "data_size": 63488 00:26:10.869 } 00:26:10.869 ] 00:26:10.869 }' 00:26:10.869 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:10.869 07:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:11.434 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:11.434 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.434 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:11.434 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:11.434 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.434 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.434 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.694 07:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.694 "name": "raid_bdev1", 00:26:11.694 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:11.694 "strip_size_kb": 0, 00:26:11.694 "state": "online", 00:26:11.694 "raid_level": "raid1", 00:26:11.694 "superblock": true, 00:26:11.694 "num_base_bdevs": 4, 00:26:11.694 "num_base_bdevs_discovered": 2, 00:26:11.694 "num_base_bdevs_operational": 2, 00:26:11.694 "base_bdevs_list": [ 00:26:11.694 { 00:26:11.694 "name": null, 00:26:11.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.694 "is_configured": false, 00:26:11.694 "data_offset": 2048, 00:26:11.694 "data_size": 63488 00:26:11.694 }, 00:26:11.694 { 00:26:11.694 "name": null, 00:26:11.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.694 "is_configured": false, 00:26:11.694 "data_offset": 2048, 00:26:11.694 "data_size": 63488 00:26:11.694 }, 00:26:11.694 { 00:26:11.694 "name": "BaseBdev3", 00:26:11.694 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:11.694 "is_configured": true, 00:26:11.694 "data_offset": 2048, 00:26:11.694 "data_size": 63488 00:26:11.694 }, 00:26:11.694 { 00:26:11.694 "name": "BaseBdev4", 00:26:11.694 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:11.694 "is_configured": true, 00:26:11.694 "data_offset": 2048, 00:26:11.694 "data_size": 63488 00:26:11.694 } 00:26:11.694 ] 00:26:11.694 }' 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:11.694 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:11.952 [2024-07-25 07:31:44.307126] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:11.952 [2024-07-25 07:31:44.307250] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:11.952 [2024-07-25 07:31:44.307266] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:11.952 request: 00:26:11.952 { 00:26:11.952 "base_bdev": "BaseBdev1", 00:26:11.952 "raid_bdev": "raid_bdev1", 00:26:11.952 "method": "bdev_raid_add_base_bdev", 00:26:11.952 "req_id": 1 00:26:11.952 } 00:26:11.952 Got JSON-RPC error response 00:26:11.952 response: 00:26:11.952 { 00:26:11.952 "code": -22, 00:26:11.952 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:11.952 } 00:26:11.952 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:26:11.952 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:26:11.952 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:26:11.952 07:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:26:11.952 07:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:26:12.926 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.927 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.185 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.185 "name": "raid_bdev1", 00:26:13.185 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:13.185 "strip_size_kb": 0, 00:26:13.185 "state": "online", 00:26:13.185 "raid_level": "raid1", 00:26:13.185 "superblock": true, 00:26:13.185 "num_base_bdevs": 4, 00:26:13.185 "num_base_bdevs_discovered": 2, 00:26:13.185 "num_base_bdevs_operational": 2, 00:26:13.185 "base_bdevs_list": [ 00:26:13.185 { 00:26:13.185 "name": null, 00:26:13.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.185 "is_configured": false, 00:26:13.185 "data_offset": 2048, 00:26:13.185 "data_size": 63488 00:26:13.185 }, 00:26:13.185 { 00:26:13.186 "name": null, 00:26:13.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.186 "is_configured": false, 00:26:13.186 "data_offset": 2048, 00:26:13.186 "data_size": 63488 00:26:13.186 }, 00:26:13.186 { 00:26:13.186 "name": "BaseBdev3", 00:26:13.186 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:13.186 "is_configured": true, 00:26:13.186 "data_offset": 2048, 00:26:13.186 "data_size": 63488 00:26:13.186 }, 00:26:13.186 { 00:26:13.186 "name": "BaseBdev4", 00:26:13.186 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:13.186 "is_configured": true, 00:26:13.186 "data_offset": 2048, 00:26:13.186 "data_size": 63488 00:26:13.186 } 00:26:13.186 ] 00:26:13.186 }' 00:26:13.186 07:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.186 07:31:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:13.753 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:13.753 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:13.753 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:13.753 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:13.753 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:13.753 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.753 07:31:46 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:14.012 "name": "raid_bdev1", 00:26:14.012 "uuid": "fac5dac2-4d23-4e9e-916d-89c2f2868fdd", 00:26:14.012 "strip_size_kb": 0, 00:26:14.012 "state": "online", 00:26:14.012 "raid_level": "raid1", 00:26:14.012 "superblock": true, 00:26:14.012 "num_base_bdevs": 4, 00:26:14.012 "num_base_bdevs_discovered": 2, 00:26:14.012 "num_base_bdevs_operational": 2, 00:26:14.012 "base_bdevs_list": [ 00:26:14.012 { 00:26:14.012 "name": null, 00:26:14.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.012 "is_configured": false, 00:26:14.012 "data_offset": 2048, 00:26:14.012 "data_size": 63488 00:26:14.012 }, 00:26:14.012 { 00:26:14.012 "name": null, 00:26:14.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.012 "is_configured": false, 00:26:14.012 "data_offset": 2048, 00:26:14.012 "data_size": 63488 00:26:14.012 }, 00:26:14.012 { 00:26:14.012 "name": "BaseBdev3", 00:26:14.012 "uuid": "cff66dc6-8bad-5597-970d-36b8e0516cc5", 00:26:14.012 "is_configured": true, 00:26:14.012 "data_offset": 2048, 00:26:14.012 "data_size": 63488 00:26:14.012 }, 00:26:14.012 { 00:26:14.012 "name": "BaseBdev4", 00:26:14.012 "uuid": "fdf699cc-c5d1-5cd8-8099-b5064da291a2", 00:26:14.012 "is_configured": true, 00:26:14.012 "data_offset": 2048, 00:26:14.012 "data_size": 63488 00:26:14.012 } 00:26:14.012 ] 00:26:14.012 }' 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 1737476 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1737476 ']' 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 1737476 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1737476 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1737476' 00:26:14.012 killing process with pid 1737476 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 1737476 00:26:14.012 Received shutdown signal, test time was about 60.000000 seconds 00:26:14.012 00:26:14.012 Latency(us) 00:26:14.012 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:14.012 =================================================================================================================== 00:26:14.012 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:14.012 [2024-07-25 07:31:46.523272] 
bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:14.012 [2024-07-25 07:31:46.523362] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:14.012 [2024-07-25 07:31:46.523416] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:14.012 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 1737476 00:26:14.012 [2024-07-25 07:31:46.523428] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2098080 name raid_bdev1, state offline 00:26:14.271 [2024-07-25 07:31:46.563309] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:14.271 07:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:26:14.271 00:26:14.271 real 0m35.737s 00:26:14.271 user 0m52.005s 00:26:14.271 sys 0m6.258s 00:26:14.271 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:14.271 07:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:14.271 ************************************ 00:26:14.271 END TEST raid_rebuild_test_sb 00:26:14.271 ************************************ 00:26:14.271 07:31:46 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:26:14.271 07:31:46 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:14.271 07:31:46 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:14.271 07:31:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:14.530 ************************************ 00:26:14.530 START TEST raid_rebuild_test_io 00:26:14.530 ************************************ 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false true true 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:14.530 07:31:46 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1743780 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1743780 /var/tmp/spdk-raid.sock 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 1743780 ']' 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:14.530 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:14.530 07:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:14.530 [2024-07-25 07:31:46.903818] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:26:14.530 [2024-07-25 07:31:46.903874] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1743780 ] 00:26:14.530 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:14.530 Zero copy mechanism will not be used. 
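The raid_rebuild_test_io case started above drives background I/O with bdevperf while base bdevs are removed and rebuilt. Stripped of the harness, the pattern traced here and in the following steps is roughly as follows (same socket, target and parameters as the bdevperf command line above, paths relative to the SPDK tree — a sketch, not the test script itself):

    # 60 s of 50/50 random read/write against raid_bdev1, 3 MiB I/Os at queue depth 2;
    # -z makes bdevperf wait for the perform_tests RPC, -L bdev_raid enables raid debug logging
    build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &

    # ... create the malloc/passthru base bdevs and raid_bdev1 over the same RPC socket ...

    # then start the timed I/O run
    examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests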
00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.0 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.1 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.2 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.3 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.4 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.5 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.6 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:01.7 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.0 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.1 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.2 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.3 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.4 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.5 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.6 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3d:02.7 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3f:01.0 cannot be used 00:26:14.530 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.530 EAL: Requested device 0000:3f:01.1 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:01.2 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:01.3 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:01.4 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:01.5 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:01.6 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:01.7 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.0 cannot be used 00:26:14.531 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.1 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.2 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.3 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.4 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.5 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.6 cannot be used 00:26:14.531 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:14.531 EAL: Requested device 0000:3f:02.7 cannot be used 00:26:14.531 [2024-07-25 07:31:47.037337] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:14.789 [2024-07-25 07:31:47.124614] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:26:14.789 [2024-07-25 07:31:47.183458] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:14.789 [2024-07-25 07:31:47.183492] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:15.356 07:31:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:15.356 07:31:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:26:15.356 07:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:15.356 07:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:15.614 BaseBdev1_malloc 00:26:15.614 07:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:15.614 [2024-07-25 07:31:48.091244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:15.614 [2024-07-25 07:31:48.091289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:15.614 [2024-07-25 07:31:48.091310] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa35690 00:26:15.614 [2024-07-25 07:31:48.091326] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:15.614 [2024-07-25 07:31:48.092797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:15.614 [2024-07-25 07:31:48.092824] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:15.614 BaseBdev1 00:26:15.614 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:15.614 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:15.873 BaseBdev2_malloc 00:26:15.873 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 
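Each base device in this test is a 32 MB, 512-byte-block malloc bdev wrapped in a passthru bdev, so that it can later be deleted and re-added to trigger a rebuild while bdevperf keeps issuing I/O. The per-device setup traced here and just below condenses to roughly (names, sizes and socket as in the trace — a sketch of the harness steps, not a new procedure):

    for i in 1 2 3 4; do
        scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev${i}_malloc   # 32 MB backing store, 512 B blocks
        scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1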
00:26:16.131 [2024-07-25 07:31:48.468674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:16.131 [2024-07-25 07:31:48.468714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.131 [2024-07-25 07:31:48.468736] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa36050 00:26:16.131 [2024-07-25 07:31:48.468748] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.131 [2024-07-25 07:31:48.470105] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.131 [2024-07-25 07:31:48.470132] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:16.131 BaseBdev2 00:26:16.131 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:16.131 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:16.131 BaseBdev3_malloc 00:26:16.131 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:16.389 [2024-07-25 07:31:48.841992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:16.389 [2024-07-25 07:31:48.842033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.389 [2024-07-25 07:31:48.842053] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad4280 00:26:16.390 [2024-07-25 07:31:48.842064] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.390 [2024-07-25 07:31:48.843443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.390 [2024-07-25 07:31:48.843470] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:16.390 BaseBdev3 00:26:16.390 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:16.390 07:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:16.648 BaseBdev4_malloc 00:26:16.648 07:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:16.907 [2024-07-25 07:31:49.291544] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:16.907 [2024-07-25 07:31:49.291586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.907 [2024-07-25 07:31:49.291605] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad7600 00:26:16.907 [2024-07-25 07:31:49.291616] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.907 [2024-07-25 07:31:49.292954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.907 [2024-07-25 07:31:49.292980] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:16.907 BaseBdev4 00:26:16.907 07:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:17.165 spare_malloc 00:26:17.165 07:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:17.423 spare_delay 00:26:17.424 07:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:17.682 [2024-07-25 07:31:49.965508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:17.682 [2024-07-25 07:31:49.965548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:17.682 [2024-07-25 07:31:49.965568] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xae14c0 00:26:17.682 [2024-07-25 07:31:49.965580] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:17.682 [2024-07-25 07:31:49.966918] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:17.682 [2024-07-25 07:31:49.966944] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:17.682 spare 00:26:17.682 07:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:17.682 [2024-07-25 07:31:50.194130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:17.682 [2024-07-25 07:31:50.195359] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:17.682 [2024-07-25 07:31:50.195409] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:17.682 [2024-07-25 07:31:50.195450] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:17.682 [2024-07-25 07:31:50.195530] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xad82f0 00:26:17.682 [2024-07-25 07:31:50.195540] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:17.682 [2024-07-25 07:31:50.195739] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad8960 00:26:17.682 [2024-07-25 07:31:50.195883] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xad82f0 00:26:17.682 [2024-07-25 07:31:50.195892] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xad82f0 00:26:17.682 [2024-07-25 07:31:50.195996] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.682 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.941 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.941 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.941 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.941 "name": "raid_bdev1", 00:26:17.941 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:17.941 "strip_size_kb": 0, 00:26:17.941 "state": "online", 00:26:17.941 "raid_level": "raid1", 00:26:17.941 "superblock": false, 00:26:17.941 "num_base_bdevs": 4, 00:26:17.941 "num_base_bdevs_discovered": 4, 00:26:17.941 "num_base_bdevs_operational": 4, 00:26:17.941 "base_bdevs_list": [ 00:26:17.941 { 00:26:17.941 "name": "BaseBdev1", 00:26:17.941 "uuid": "aaa89811-442e-5761-a5de-0f9d5f7f5ebe", 00:26:17.941 "is_configured": true, 00:26:17.941 "data_offset": 0, 00:26:17.941 "data_size": 65536 00:26:17.941 }, 00:26:17.941 { 00:26:17.941 "name": "BaseBdev2", 00:26:17.941 "uuid": "5bf6919e-f862-5597-9c63-21a466cf0c94", 00:26:17.941 "is_configured": true, 00:26:17.941 "data_offset": 0, 00:26:17.941 "data_size": 65536 00:26:17.941 }, 00:26:17.941 { 00:26:17.941 "name": "BaseBdev3", 00:26:17.941 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:17.941 "is_configured": true, 00:26:17.941 "data_offset": 0, 00:26:17.941 "data_size": 65536 00:26:17.941 }, 00:26:17.941 { 00:26:17.941 "name": "BaseBdev4", 00:26:17.941 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:17.941 "is_configured": true, 00:26:17.941 "data_offset": 0, 00:26:17.941 "data_size": 65536 00:26:17.941 } 00:26:17.941 ] 00:26:17.941 }' 00:26:17.941 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.941 07:31:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:18.508 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:18.508 07:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:26:18.766 [2024-07-25 07:31:51.140895] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:18.767 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:26:18.767 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.767 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:19.025 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:26:19.025 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:26:19.025 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:19.025 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:19.025 [2024-07-25 07:31:51.495521] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa2f430 00:26:19.025 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:19.025 Zero copy mechanism will not be used. 00:26:19.025 Running I/O for 60 seconds... 00:26:19.284 [2024-07-25 07:31:51.612095] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:19.284 [2024-07-25 07:31:51.619603] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa2f430 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.284 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.543 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:19.543 "name": "raid_bdev1", 00:26:19.543 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:19.543 "strip_size_kb": 0, 00:26:19.543 "state": "online", 00:26:19.543 "raid_level": "raid1", 00:26:19.543 "superblock": false, 00:26:19.543 "num_base_bdevs": 4, 00:26:19.543 "num_base_bdevs_discovered": 3, 00:26:19.543 "num_base_bdevs_operational": 3, 00:26:19.543 "base_bdevs_list": [ 00:26:19.543 { 00:26:19.543 "name": null, 00:26:19.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.543 "is_configured": false, 00:26:19.543 "data_offset": 0, 00:26:19.543 "data_size": 65536 00:26:19.543 }, 00:26:19.543 { 00:26:19.543 "name": "BaseBdev2", 00:26:19.543 "uuid": "5bf6919e-f862-5597-9c63-21a466cf0c94", 00:26:19.543 "is_configured": true, 00:26:19.543 "data_offset": 0, 00:26:19.543 "data_size": 65536 00:26:19.543 }, 00:26:19.543 { 00:26:19.543 "name": "BaseBdev3", 00:26:19.543 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:19.543 "is_configured": true, 00:26:19.543 "data_offset": 0, 00:26:19.543 "data_size": 65536 00:26:19.543 }, 00:26:19.543 { 00:26:19.543 "name": "BaseBdev4", 00:26:19.543 "uuid": 
"a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:19.543 "is_configured": true, 00:26:19.543 "data_offset": 0, 00:26:19.543 "data_size": 65536 00:26:19.543 } 00:26:19.543 ] 00:26:19.543 }' 00:26:19.543 07:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:19.543 07:31:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:20.110 07:31:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:20.369 [2024-07-25 07:31:52.721273] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:20.369 07:31:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:20.369 [2024-07-25 07:31:52.780686] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa32e30 00:26:20.369 [2024-07-25 07:31:52.782865] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:20.369 [2024-07-25 07:31:52.894026] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:20.369 [2024-07-25 07:31:52.894307] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:20.628 [2024-07-25 07:31:53.030766] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:20.628 [2024-07-25 07:31:53.030923] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:20.886 [2024-07-25 07:31:53.288957] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:21.144 [2024-07-25 07:31:53.435398] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:21.403 [2024-07-25 07:31:53.710996] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:21.403 07:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:21.403 07:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:21.403 07:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:21.403 07:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:21.403 07:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:21.403 07:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.403 07:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.661 [2024-07-25 07:31:53.940959] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:21.661 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:21.661 "name": "raid_bdev1", 00:26:21.661 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:21.661 "strip_size_kb": 0, 00:26:21.661 "state": "online", 00:26:21.661 "raid_level": "raid1", 00:26:21.661 
"superblock": false, 00:26:21.661 "num_base_bdevs": 4, 00:26:21.661 "num_base_bdevs_discovered": 4, 00:26:21.661 "num_base_bdevs_operational": 4, 00:26:21.661 "process": { 00:26:21.661 "type": "rebuild", 00:26:21.661 "target": "spare", 00:26:21.661 "progress": { 00:26:21.661 "blocks": 16384, 00:26:21.661 "percent": 25 00:26:21.661 } 00:26:21.661 }, 00:26:21.661 "base_bdevs_list": [ 00:26:21.661 { 00:26:21.661 "name": "spare", 00:26:21.661 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:21.661 "is_configured": true, 00:26:21.661 "data_offset": 0, 00:26:21.661 "data_size": 65536 00:26:21.661 }, 00:26:21.661 { 00:26:21.661 "name": "BaseBdev2", 00:26:21.661 "uuid": "5bf6919e-f862-5597-9c63-21a466cf0c94", 00:26:21.661 "is_configured": true, 00:26:21.661 "data_offset": 0, 00:26:21.661 "data_size": 65536 00:26:21.661 }, 00:26:21.661 { 00:26:21.661 "name": "BaseBdev3", 00:26:21.661 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:21.661 "is_configured": true, 00:26:21.661 "data_offset": 0, 00:26:21.661 "data_size": 65536 00:26:21.661 }, 00:26:21.661 { 00:26:21.661 "name": "BaseBdev4", 00:26:21.661 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:21.661 "is_configured": true, 00:26:21.661 "data_offset": 0, 00:26:21.661 "data_size": 65536 00:26:21.661 } 00:26:21.661 ] 00:26:21.661 }' 00:26:21.661 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:21.661 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:21.661 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:21.661 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:21.661 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:21.920 [2024-07-25 07:31:54.311515] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:21.920 [2024-07-25 07:31:54.402944] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:22.178 [2024-07-25 07:31:54.505327] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:22.178 [2024-07-25 07:31:54.523983] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:22.178 [2024-07-25 07:31:54.524009] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:22.178 [2024-07-25 07:31:54.524018] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:22.178 [2024-07-25 07:31:54.545008] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa2f430 00:26:22.178 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:22.178 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:22.178 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:22.178 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:22.178 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:22.178 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:26:22.178 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:22.179 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:22.179 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:22.179 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:22.179 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.179 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.437 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.437 "name": "raid_bdev1", 00:26:22.437 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:22.437 "strip_size_kb": 0, 00:26:22.437 "state": "online", 00:26:22.437 "raid_level": "raid1", 00:26:22.437 "superblock": false, 00:26:22.437 "num_base_bdevs": 4, 00:26:22.437 "num_base_bdevs_discovered": 3, 00:26:22.437 "num_base_bdevs_operational": 3, 00:26:22.437 "base_bdevs_list": [ 00:26:22.437 { 00:26:22.437 "name": null, 00:26:22.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.437 "is_configured": false, 00:26:22.437 "data_offset": 0, 00:26:22.437 "data_size": 65536 00:26:22.437 }, 00:26:22.437 { 00:26:22.437 "name": "BaseBdev2", 00:26:22.437 "uuid": "5bf6919e-f862-5597-9c63-21a466cf0c94", 00:26:22.437 "is_configured": true, 00:26:22.437 "data_offset": 0, 00:26:22.437 "data_size": 65536 00:26:22.437 }, 00:26:22.437 { 00:26:22.437 "name": "BaseBdev3", 00:26:22.437 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:22.437 "is_configured": true, 00:26:22.437 "data_offset": 0, 00:26:22.438 "data_size": 65536 00:26:22.438 }, 00:26:22.438 { 00:26:22.438 "name": "BaseBdev4", 00:26:22.438 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:22.438 "is_configured": true, 00:26:22.438 "data_offset": 0, 00:26:22.438 "data_size": 65536 00:26:22.438 } 00:26:22.438 ] 00:26:22.438 }' 00:26:22.438 07:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.438 07:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:23.005 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:23.005 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:23.005 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:23.005 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:23.005 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:23.005 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.005 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.263 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:23.263 "name": "raid_bdev1", 00:26:23.264 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:23.264 "strip_size_kb": 0, 00:26:23.264 "state": "online", 00:26:23.264 "raid_level": 
"raid1", 00:26:23.264 "superblock": false, 00:26:23.264 "num_base_bdevs": 4, 00:26:23.264 "num_base_bdevs_discovered": 3, 00:26:23.264 "num_base_bdevs_operational": 3, 00:26:23.264 "base_bdevs_list": [ 00:26:23.264 { 00:26:23.264 "name": null, 00:26:23.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:23.264 "is_configured": false, 00:26:23.264 "data_offset": 0, 00:26:23.264 "data_size": 65536 00:26:23.264 }, 00:26:23.264 { 00:26:23.264 "name": "BaseBdev2", 00:26:23.264 "uuid": "5bf6919e-f862-5597-9c63-21a466cf0c94", 00:26:23.264 "is_configured": true, 00:26:23.264 "data_offset": 0, 00:26:23.264 "data_size": 65536 00:26:23.264 }, 00:26:23.264 { 00:26:23.264 "name": "BaseBdev3", 00:26:23.264 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:23.264 "is_configured": true, 00:26:23.264 "data_offset": 0, 00:26:23.264 "data_size": 65536 00:26:23.264 }, 00:26:23.264 { 00:26:23.264 "name": "BaseBdev4", 00:26:23.264 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:23.264 "is_configured": true, 00:26:23.264 "data_offset": 0, 00:26:23.264 "data_size": 65536 00:26:23.264 } 00:26:23.264 ] 00:26:23.264 }' 00:26:23.264 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:23.264 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:23.264 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:23.264 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:23.264 07:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:23.522 [2024-07-25 07:31:55.987703] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:23.522 07:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:26:23.522 [2024-07-25 07:31:56.022417] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe7270 00:26:23.522 [2024-07-25 07:31:56.023821] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:23.780 [2024-07-25 07:31:56.150954] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:23.780 [2024-07-25 07:31:56.152184] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:24.038 [2024-07-25 07:31:56.390996] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:24.038 [2024-07-25 07:31:56.391560] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:24.327 [2024-07-25 07:31:56.792635] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:24.588 [2024-07-25 07:31:57.021376] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:24.588 [2024-07-25 07:31:57.021932] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:24.588 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:24.588 07:31:57 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:24.588 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:24.588 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:24.588 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:24.588 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.588 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.846 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:24.846 "name": "raid_bdev1", 00:26:24.846 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:24.846 "strip_size_kb": 0, 00:26:24.846 "state": "online", 00:26:24.846 "raid_level": "raid1", 00:26:24.846 "superblock": false, 00:26:24.846 "num_base_bdevs": 4, 00:26:24.846 "num_base_bdevs_discovered": 4, 00:26:24.847 "num_base_bdevs_operational": 4, 00:26:24.847 "process": { 00:26:24.847 "type": "rebuild", 00:26:24.847 "target": "spare", 00:26:24.847 "progress": { 00:26:24.847 "blocks": 10240, 00:26:24.847 "percent": 15 00:26:24.847 } 00:26:24.847 }, 00:26:24.847 "base_bdevs_list": [ 00:26:24.847 { 00:26:24.847 "name": "spare", 00:26:24.847 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:24.847 "is_configured": true, 00:26:24.847 "data_offset": 0, 00:26:24.847 "data_size": 65536 00:26:24.847 }, 00:26:24.847 { 00:26:24.847 "name": "BaseBdev2", 00:26:24.847 "uuid": "5bf6919e-f862-5597-9c63-21a466cf0c94", 00:26:24.847 "is_configured": true, 00:26:24.847 "data_offset": 0, 00:26:24.847 "data_size": 65536 00:26:24.847 }, 00:26:24.847 { 00:26:24.847 "name": "BaseBdev3", 00:26:24.847 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:24.847 "is_configured": true, 00:26:24.847 "data_offset": 0, 00:26:24.847 "data_size": 65536 00:26:24.847 }, 00:26:24.847 { 00:26:24.847 "name": "BaseBdev4", 00:26:24.847 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:24.847 "is_configured": true, 00:26:24.847 "data_offset": 0, 00:26:24.847 "data_size": 65536 00:26:24.847 } 00:26:24.847 ] 00:26:24.847 }' 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:26:24.847 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:25.105 [2024-07-25 07:31:57.527279] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:25.105 [2024-07-25 07:31:57.527839] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:25.105 [2024-07-25 07:31:57.558449] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:25.364 [2024-07-25 07:31:57.858023] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xa2f430 00:26:25.364 [2024-07-25 07:31:57.858057] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xbe7270 00:26:25.364 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:26:25.364 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:26:25.364 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:25.364 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.622 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:25.622 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:25.622 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.622 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.622 07:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.622 [2024-07-25 07:31:58.114438] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:25.622 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.622 "name": "raid_bdev1", 00:26:25.622 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:25.622 "strip_size_kb": 0, 00:26:25.622 "state": "online", 00:26:25.622 "raid_level": "raid1", 00:26:25.622 "superblock": false, 00:26:25.622 "num_base_bdevs": 4, 00:26:25.622 "num_base_bdevs_discovered": 3, 00:26:25.622 "num_base_bdevs_operational": 3, 00:26:25.622 "process": { 00:26:25.622 "type": "rebuild", 00:26:25.622 "target": "spare", 00:26:25.622 "progress": { 00:26:25.622 "blocks": 20480, 00:26:25.622 "percent": 31 00:26:25.622 } 00:26:25.622 }, 00:26:25.622 "base_bdevs_list": [ 00:26:25.622 { 00:26:25.622 "name": "spare", 00:26:25.622 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:25.622 "is_configured": true, 00:26:25.622 "data_offset": 0, 00:26:25.622 "data_size": 65536 00:26:25.622 }, 00:26:25.622 { 00:26:25.622 "name": null, 00:26:25.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.622 "is_configured": false, 00:26:25.622 "data_offset": 0, 00:26:25.622 "data_size": 65536 00:26:25.622 }, 00:26:25.622 { 00:26:25.622 "name": "BaseBdev3", 00:26:25.622 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:25.622 "is_configured": true, 00:26:25.622 "data_offset": 0, 00:26:25.622 "data_size": 65536 00:26:25.622 }, 00:26:25.622 { 00:26:25.622 "name": "BaseBdev4", 00:26:25.622 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:25.622 "is_configured": true, 00:26:25.622 "data_offset": 0, 00:26:25.622 "data_size": 65536 00:26:25.622 } 00:26:25.622 ] 00:26:25.622 }' 00:26:25.622 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 
-- # jq -r '.process.type // "none"' 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=896 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.881 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.139 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:26.139 "name": "raid_bdev1", 00:26:26.139 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:26.139 "strip_size_kb": 0, 00:26:26.139 "state": "online", 00:26:26.139 "raid_level": "raid1", 00:26:26.139 "superblock": false, 00:26:26.139 "num_base_bdevs": 4, 00:26:26.139 "num_base_bdevs_discovered": 3, 00:26:26.140 "num_base_bdevs_operational": 3, 00:26:26.140 "process": { 00:26:26.140 "type": "rebuild", 00:26:26.140 "target": "spare", 00:26:26.140 "progress": { 00:26:26.140 "blocks": 24576, 00:26:26.140 "percent": 37 00:26:26.140 } 00:26:26.140 }, 00:26:26.140 "base_bdevs_list": [ 00:26:26.140 { 00:26:26.140 "name": "spare", 00:26:26.140 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:26.140 "is_configured": true, 00:26:26.140 "data_offset": 0, 00:26:26.140 "data_size": 65536 00:26:26.140 }, 00:26:26.140 { 00:26:26.140 "name": null, 00:26:26.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.140 "is_configured": false, 00:26:26.140 "data_offset": 0, 00:26:26.140 "data_size": 65536 00:26:26.140 }, 00:26:26.140 { 00:26:26.140 "name": "BaseBdev3", 00:26:26.140 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:26.140 "is_configured": true, 00:26:26.140 "data_offset": 0, 00:26:26.140 "data_size": 65536 00:26:26.140 }, 00:26:26.140 { 00:26:26.140 "name": "BaseBdev4", 00:26:26.140 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:26.140 "is_configured": true, 00:26:26.140 "data_offset": 0, 00:26:26.140 "data_size": 65536 00:26:26.140 } 00:26:26.140 ] 00:26:26.140 }' 00:26:26.140 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:26.140 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:26.140 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:26.140 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 
00:26:26.140 07:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:26.398 [2024-07-25 07:31:58.826326] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:26.656 [2024-07-25 07:31:58.953326] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.222 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.481 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.481 "name": "raid_bdev1", 00:26:27.481 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:27.481 "strip_size_kb": 0, 00:26:27.481 "state": "online", 00:26:27.481 "raid_level": "raid1", 00:26:27.481 "superblock": false, 00:26:27.481 "num_base_bdevs": 4, 00:26:27.481 "num_base_bdevs_discovered": 3, 00:26:27.481 "num_base_bdevs_operational": 3, 00:26:27.481 "process": { 00:26:27.481 "type": "rebuild", 00:26:27.481 "target": "spare", 00:26:27.481 "progress": { 00:26:27.481 "blocks": 47104, 00:26:27.481 "percent": 71 00:26:27.481 } 00:26:27.481 }, 00:26:27.481 "base_bdevs_list": [ 00:26:27.481 { 00:26:27.481 "name": "spare", 00:26:27.481 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:27.481 "is_configured": true, 00:26:27.481 "data_offset": 0, 00:26:27.481 "data_size": 65536 00:26:27.481 }, 00:26:27.481 { 00:26:27.481 "name": null, 00:26:27.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.481 "is_configured": false, 00:26:27.481 "data_offset": 0, 00:26:27.481 "data_size": 65536 00:26:27.481 }, 00:26:27.481 { 00:26:27.481 "name": "BaseBdev3", 00:26:27.481 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:27.481 "is_configured": true, 00:26:27.481 "data_offset": 0, 00:26:27.481 "data_size": 65536 00:26:27.481 }, 00:26:27.481 { 00:26:27.481 "name": "BaseBdev4", 00:26:27.481 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:27.481 "is_configured": true, 00:26:27.481 "data_offset": 0, 00:26:27.481 "data_size": 65536 00:26:27.481 } 00:26:27.481 ] 00:26:27.481 }' 00:26:27.481 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.481 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:27.481 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.481 07:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:27.481 07:31:59 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@726 -- # sleep 1 00:26:27.481 [2024-07-25 07:31:59.949616] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:27.739 [2024-07-25 07:32:00.068246] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:28.305 [2024-07-25 07:32:00.737363] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:28.563 [2024-07-25 07:32:00.844991] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:28.563 [2024-07-25 07:32:00.846940] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.563 07:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.563 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.563 "name": "raid_bdev1", 00:26:28.563 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:28.563 "strip_size_kb": 0, 00:26:28.563 "state": "online", 00:26:28.563 "raid_level": "raid1", 00:26:28.563 "superblock": false, 00:26:28.563 "num_base_bdevs": 4, 00:26:28.563 "num_base_bdevs_discovered": 3, 00:26:28.563 "num_base_bdevs_operational": 3, 00:26:28.563 "base_bdevs_list": [ 00:26:28.563 { 00:26:28.563 "name": "spare", 00:26:28.563 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:28.563 "is_configured": true, 00:26:28.563 "data_offset": 0, 00:26:28.563 "data_size": 65536 00:26:28.563 }, 00:26:28.563 { 00:26:28.563 "name": null, 00:26:28.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:28.563 "is_configured": false, 00:26:28.563 "data_offset": 0, 00:26:28.563 "data_size": 65536 00:26:28.563 }, 00:26:28.563 { 00:26:28.563 "name": "BaseBdev3", 00:26:28.563 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:28.563 "is_configured": true, 00:26:28.563 "data_offset": 0, 00:26:28.563 "data_size": 65536 00:26:28.563 }, 00:26:28.563 { 00:26:28.563 "name": "BaseBdev4", 00:26:28.563 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:28.563 "is_configured": true, 00:26:28.563 "data_offset": 0, 00:26:28.563 "data_size": 65536 00:26:28.563 } 00:26:28.563 ] 00:26:28.563 }' 00:26:28.563 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.820 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.077 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:29.077 "name": "raid_bdev1", 00:26:29.077 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:29.077 "strip_size_kb": 0, 00:26:29.077 "state": "online", 00:26:29.077 "raid_level": "raid1", 00:26:29.077 "superblock": false, 00:26:29.077 "num_base_bdevs": 4, 00:26:29.077 "num_base_bdevs_discovered": 3, 00:26:29.077 "num_base_bdevs_operational": 3, 00:26:29.077 "base_bdevs_list": [ 00:26:29.077 { 00:26:29.077 "name": "spare", 00:26:29.077 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:29.077 "is_configured": true, 00:26:29.077 "data_offset": 0, 00:26:29.077 "data_size": 65536 00:26:29.077 }, 00:26:29.077 { 00:26:29.077 "name": null, 00:26:29.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.077 "is_configured": false, 00:26:29.077 "data_offset": 0, 00:26:29.077 "data_size": 65536 00:26:29.077 }, 00:26:29.077 { 00:26:29.077 "name": "BaseBdev3", 00:26:29.077 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:29.077 "is_configured": true, 00:26:29.077 "data_offset": 0, 00:26:29.077 "data_size": 65536 00:26:29.077 }, 00:26:29.077 { 00:26:29.077 "name": "BaseBdev4", 00:26:29.077 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:29.077 "is_configured": true, 00:26:29.077 "data_offset": 0, 00:26:29.078 "data_size": 65536 00:26:29.078 } 00:26:29.078 ] 00:26:29.078 }' 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.078 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.336 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.336 "name": "raid_bdev1", 00:26:29.336 "uuid": "a198234b-f83a-4dab-bc60-fc65ceac2b13", 00:26:29.336 "strip_size_kb": 0, 00:26:29.336 "state": "online", 00:26:29.336 "raid_level": "raid1", 00:26:29.336 "superblock": false, 00:26:29.336 "num_base_bdevs": 4, 00:26:29.336 "num_base_bdevs_discovered": 3, 00:26:29.336 "num_base_bdevs_operational": 3, 00:26:29.336 "base_bdevs_list": [ 00:26:29.336 { 00:26:29.336 "name": "spare", 00:26:29.336 "uuid": "4e5d7a12-ae28-5f5c-8676-ac9dec3ca061", 00:26:29.336 "is_configured": true, 00:26:29.336 "data_offset": 0, 00:26:29.336 "data_size": 65536 00:26:29.336 }, 00:26:29.336 { 00:26:29.336 "name": null, 00:26:29.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.336 "is_configured": false, 00:26:29.336 "data_offset": 0, 00:26:29.336 "data_size": 65536 00:26:29.336 }, 00:26:29.336 { 00:26:29.336 "name": "BaseBdev3", 00:26:29.336 "uuid": "a4cc877e-0331-5bfc-8f64-eed5190cffbd", 00:26:29.336 "is_configured": true, 00:26:29.336 "data_offset": 0, 00:26:29.336 "data_size": 65536 00:26:29.336 }, 00:26:29.336 { 00:26:29.336 "name": "BaseBdev4", 00:26:29.336 "uuid": "a05876ec-3777-5635-bd80-fb161843d5d1", 00:26:29.336 "is_configured": true, 00:26:29.336 "data_offset": 0, 00:26:29.336 "data_size": 65536 00:26:29.336 } 00:26:29.336 ] 00:26:29.336 }' 00:26:29.336 07:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.336 07:32:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:29.902 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:30.160 [2024-07-25 07:32:02.543798] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:30.160 [2024-07-25 07:32:02.543827] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:30.160 00:26:30.160 Latency(us) 00:26:30.160 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:30.160 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:30.160 raid_bdev1 : 11.08 100.13 300.40 0.00 0.00 13647.16 275.25 120795.96 00:26:30.160 =================================================================================================================== 00:26:30.160 Total : 100.13 300.40 0.00 0.00 13647.16 275.25 120795.96 00:26:30.160 [2024-07-25 07:32:02.603593] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:30.160 [2024-07-25 07:32:02.603619] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:26:30.160 [2024-07-25 07:32:02.603703] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:30.160 [2024-07-25 07:32:02.603713] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad82f0 name raid_bdev1, state offline 00:26:30.160 0 00:26:30.160 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.160 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # jq length 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:30.418 07:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:30.676 /dev/nbd0 00:26:30.676 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:30.676 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:30.676 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:30.676 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:26:30.676 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:30.676 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:30.677 1+0 records in 00:26:30.677 1+0 records out 00:26:30.677 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280818 s, 14.6 MB/s 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@743 -- # continue 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:30.677 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:30.935 /dev/nbd1 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:30.935 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:30.936 
07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:30.936 1+0 records in 00:26:30.936 1+0 records out 00:26:30.936 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000188045 s, 21.8 MB/s 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:30.936 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:26:31.193 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:31.194 07:32:03 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:31.194 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:31.452 /dev/nbd1 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:31.452 1+0 records in 00:26:31.452 1+0 records out 00:26:31.452 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276893 s, 14.8 MB/s 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:31.452 07:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:31.710 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:31.710 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:31.710 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:31.710 07:32:04 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@50 -- # local nbd_list 00:26:31.710 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:31.710 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:31.710 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:31.969 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1743780 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 1743780 ']' 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 1743780 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:32.227 07:32:04 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1743780 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1743780' 00:26:32.227 killing process with pid 1743780 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 1743780 00:26:32.227 Received shutdown signal, test time was about 13.069539 seconds 00:26:32.227 00:26:32.227 Latency(us) 00:26:32.227 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:32.227 =================================================================================================================== 00:26:32.227 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:32.227 [2024-07-25 07:32:04.599088] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:32.227 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1743780 00:26:32.227 [2024-07-25 07:32:04.632924] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:26:32.486 00:26:32.486 real 0m17.988s 00:26:32.486 user 0m27.453s 00:26:32.486 sys 0m3.302s 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:32.486 ************************************ 00:26:32.486 END TEST raid_rebuild_test_io 00:26:32.486 ************************************ 00:26:32.486 07:32:04 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:26:32.486 07:32:04 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:32.486 07:32:04 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:32.486 07:32:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:32.486 ************************************ 00:26:32.486 START TEST raid_rebuild_test_sb_io 00:26:32.486 ************************************ 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true true true 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i 
<= num_base_bdevs )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:26:32.486 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1747055 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1747055 /var/tmp/spdk-raid.sock 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1747055 ']' 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:32.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:32.487 07:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:32.487 [2024-07-25 07:32:04.980803] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:26:32.487 [2024-07-25 07:32:04.980859] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1747055 ] 00:26:32.487 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:32.487 Zero copy mechanism will not be used. 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.0 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.1 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.2 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.3 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.4 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.5 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.6 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:01.7 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.0 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.1 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.2 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.3 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.4 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.5 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.6 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3d:02.7 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:01.0 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:01.1 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:01.2 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 
0000:3f:01.3 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:01.4 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:01.5 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:01.6 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:01.7 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.0 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.1 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.2 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.3 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.4 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.5 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.6 cannot be used 00:26:32.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:32.746 EAL: Requested device 0000:3f:02.7 cannot be used 00:26:32.746 [2024-07-25 07:32:05.112544] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:32.746 [2024-07-25 07:32:05.200308] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:26:32.746 [2024-07-25 07:32:05.258807] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:32.746 [2024-07-25 07:32:05.258843] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:33.681 07:32:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:33.681 07:32:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:26:33.681 07:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:33.681 07:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:33.681 BaseBdev1_malloc 00:26:33.681 07:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:33.940 [2024-07-25 07:32:06.303945] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:33.940 [2024-07-25 07:32:06.303992] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.940 [2024-07-25 07:32:06.304012] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20de690 00:26:33.940 [2024-07-25 07:32:06.304023] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.940 [2024-07-25 07:32:06.305443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:26:33.940 [2024-07-25 07:32:06.305471] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:33.940 BaseBdev1 00:26:33.940 07:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:33.940 07:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:34.198 BaseBdev2_malloc 00:26:34.198 07:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:34.457 [2024-07-25 07:32:06.749385] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:34.457 [2024-07-25 07:32:06.749426] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:34.457 [2024-07-25 07:32:06.749445] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20df050 00:26:34.457 [2024-07-25 07:32:06.749456] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.457 [2024-07-25 07:32:06.750789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.457 [2024-07-25 07:32:06.750817] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:34.457 BaseBdev2 00:26:34.457 07:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:34.457 07:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:34.457 BaseBdev3_malloc 00:26:34.716 07:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:34.716 [2024-07-25 07:32:07.194767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:34.716 [2024-07-25 07:32:07.194806] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:34.716 [2024-07-25 07:32:07.194824] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217d280 00:26:34.716 [2024-07-25 07:32:07.194835] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.716 [2024-07-25 07:32:07.196121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.716 [2024-07-25 07:32:07.196159] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:34.716 BaseBdev3 00:26:34.716 07:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:34.716 07:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:34.974 BaseBdev4_malloc 00:26:34.974 07:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:35.232 [2024-07-25 07:32:07.628077] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:35.232 [2024-07-25 07:32:07.628115] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:35.232 [2024-07-25 07:32:07.628133] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2180600 00:26:35.232 [2024-07-25 07:32:07.628148] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:35.232 [2024-07-25 07:32:07.629439] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:35.232 [2024-07-25 07:32:07.629470] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:35.232 BaseBdev4 00:26:35.232 07:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:35.490 spare_malloc 00:26:35.491 07:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:35.749 spare_delay 00:26:35.749 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:36.042 [2024-07-25 07:32:08.298156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:36.042 [2024-07-25 07:32:08.298197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:36.042 [2024-07-25 07:32:08.298216] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x218a4c0 00:26:36.042 [2024-07-25 07:32:08.298227] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:36.042 [2024-07-25 07:32:08.299613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:36.042 [2024-07-25 07:32:08.299641] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:36.042 spare 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:36.042 [2024-07-25 07:32:08.510730] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:36.042 [2024-07-25 07:32:08.511823] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:36.042 [2024-07-25 07:32:08.511872] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:36.042 [2024-07-25 07:32:08.511912] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:36.042 [2024-07-25 07:32:08.512102] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21812f0 00:26:36.042 [2024-07-25 07:32:08.512112] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:36.042 [2024-07-25 07:32:08.512293] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d57b0 00:26:36.042 [2024-07-25 07:32:08.512440] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21812f0 00:26:36.042 [2024-07-25 07:32:08.512449] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x21812f0 00:26:36.042 [2024-07-25 07:32:08.512538] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:36.042 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:36.043 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:36.043 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:36.043 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:36.043 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.043 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.301 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:36.301 "name": "raid_bdev1", 00:26:36.301 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:36.301 "strip_size_kb": 0, 00:26:36.302 "state": "online", 00:26:36.302 "raid_level": "raid1", 00:26:36.302 "superblock": true, 00:26:36.302 "num_base_bdevs": 4, 00:26:36.302 "num_base_bdevs_discovered": 4, 00:26:36.302 "num_base_bdevs_operational": 4, 00:26:36.302 "base_bdevs_list": [ 00:26:36.302 { 00:26:36.302 "name": "BaseBdev1", 00:26:36.302 "uuid": "e2ef57b8-8bb1-5854-a813-7c76660b94d7", 00:26:36.302 "is_configured": true, 00:26:36.302 "data_offset": 2048, 00:26:36.302 "data_size": 63488 00:26:36.302 }, 00:26:36.302 { 00:26:36.302 "name": "BaseBdev2", 00:26:36.302 "uuid": "55a6effd-25e7-5c0f-875b-8661623062c9", 00:26:36.302 "is_configured": true, 00:26:36.302 "data_offset": 2048, 00:26:36.302 "data_size": 63488 00:26:36.302 }, 00:26:36.302 { 00:26:36.302 "name": "BaseBdev3", 00:26:36.302 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:36.302 "is_configured": true, 00:26:36.302 "data_offset": 2048, 00:26:36.302 "data_size": 63488 00:26:36.302 }, 00:26:36.302 { 00:26:36.302 "name": "BaseBdev4", 00:26:36.302 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:36.302 "is_configured": true, 00:26:36.302 "data_offset": 2048, 00:26:36.302 "data_size": 63488 00:26:36.302 } 00:26:36.302 ] 00:26:36.302 }' 00:26:36.302 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:36.302 07:32:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:36.869 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:36.869 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r 
'.[].num_blocks' 00:26:37.128 [2024-07-25 07:32:09.521660] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:37.128 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:26:37.128 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.128 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:37.387 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:26:37.387 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:26:37.387 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:37.387 07:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:37.387 [2024-07-25 07:32:09.864263] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e0920 00:26:37.387 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:37.387 Zero copy mechanism will not be used. 00:26:37.387 Running I/O for 60 seconds... 00:26:37.646 [2024-07-25 07:32:09.972519] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:37.646 [2024-07-25 07:32:09.980080] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x20e0920 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.646 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.905 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.905 "name": "raid_bdev1", 00:26:37.905 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:37.905 "strip_size_kb": 0, 00:26:37.905 "state": "online", 00:26:37.905 "raid_level": "raid1", 00:26:37.905 "superblock": true, 00:26:37.905 
"num_base_bdevs": 4, 00:26:37.905 "num_base_bdevs_discovered": 3, 00:26:37.905 "num_base_bdevs_operational": 3, 00:26:37.905 "base_bdevs_list": [ 00:26:37.905 { 00:26:37.905 "name": null, 00:26:37.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.905 "is_configured": false, 00:26:37.905 "data_offset": 2048, 00:26:37.905 "data_size": 63488 00:26:37.905 }, 00:26:37.905 { 00:26:37.905 "name": "BaseBdev2", 00:26:37.905 "uuid": "55a6effd-25e7-5c0f-875b-8661623062c9", 00:26:37.905 "is_configured": true, 00:26:37.905 "data_offset": 2048, 00:26:37.905 "data_size": 63488 00:26:37.905 }, 00:26:37.905 { 00:26:37.905 "name": "BaseBdev3", 00:26:37.905 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:37.905 "is_configured": true, 00:26:37.905 "data_offset": 2048, 00:26:37.905 "data_size": 63488 00:26:37.905 }, 00:26:37.905 { 00:26:37.905 "name": "BaseBdev4", 00:26:37.905 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:37.905 "is_configured": true, 00:26:37.905 "data_offset": 2048, 00:26:37.905 "data_size": 63488 00:26:37.905 } 00:26:37.905 ] 00:26:37.905 }' 00:26:37.905 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.905 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:38.472 07:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:38.731 [2024-07-25 07:32:11.074516] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:38.731 07:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:38.731 [2024-07-25 07:32:11.133967] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2290270 00:26:38.731 [2024-07-25 07:32:11.136176] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:38.989 [2024-07-25 07:32:11.277398] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:38.989 [2024-07-25 07:32:11.413524] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:39.557 [2024-07-25 07:32:11.790949] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:39.557 [2024-07-25 07:32:11.920260] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:39.557 [2024-07-25 07:32:11.920442] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:39.815 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:39.815 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:39.815 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:39.815 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:39.815 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:39.815 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:26:39.815 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:39.815 [2024-07-25 07:32:12.259231] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:40.074 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.074 "name": "raid_bdev1", 00:26:40.074 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:40.074 "strip_size_kb": 0, 00:26:40.074 "state": "online", 00:26:40.074 "raid_level": "raid1", 00:26:40.074 "superblock": true, 00:26:40.074 "num_base_bdevs": 4, 00:26:40.074 "num_base_bdevs_discovered": 4, 00:26:40.074 "num_base_bdevs_operational": 4, 00:26:40.074 "process": { 00:26:40.074 "type": "rebuild", 00:26:40.074 "target": "spare", 00:26:40.074 "progress": { 00:26:40.074 "blocks": 14336, 00:26:40.074 "percent": 22 00:26:40.074 } 00:26:40.074 }, 00:26:40.074 "base_bdevs_list": [ 00:26:40.074 { 00:26:40.074 "name": "spare", 00:26:40.074 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:40.074 "is_configured": true, 00:26:40.074 "data_offset": 2048, 00:26:40.074 "data_size": 63488 00:26:40.074 }, 00:26:40.074 { 00:26:40.074 "name": "BaseBdev2", 00:26:40.074 "uuid": "55a6effd-25e7-5c0f-875b-8661623062c9", 00:26:40.074 "is_configured": true, 00:26:40.074 "data_offset": 2048, 00:26:40.074 "data_size": 63488 00:26:40.074 }, 00:26:40.074 { 00:26:40.074 "name": "BaseBdev3", 00:26:40.074 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:40.074 "is_configured": true, 00:26:40.074 "data_offset": 2048, 00:26:40.074 "data_size": 63488 00:26:40.074 }, 00:26:40.074 { 00:26:40.074 "name": "BaseBdev4", 00:26:40.074 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:40.074 "is_configured": true, 00:26:40.074 "data_offset": 2048, 00:26:40.074 "data_size": 63488 00:26:40.074 } 00:26:40.074 ] 00:26:40.074 }' 00:26:40.074 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.074 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:40.074 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:40.074 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:40.074 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:40.074 [2024-07-25 07:32:12.471330] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:40.074 [2024-07-25 07:32:12.471538] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:40.333 [2024-07-25 07:32:12.660501] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:40.333 [2024-07-25 07:32:12.801587] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:40.333 [2024-07-25 07:32:12.813849] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:40.333 [2024-07-25 07:32:12.813878] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:40.333 [2024-07-25 07:32:12.813888] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: 
No such device 00:26:40.333 [2024-07-25 07:32:12.819239] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x20e0920 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.333 07:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.592 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:40.592 "name": "raid_bdev1", 00:26:40.592 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:40.592 "strip_size_kb": 0, 00:26:40.592 "state": "online", 00:26:40.592 "raid_level": "raid1", 00:26:40.592 "superblock": true, 00:26:40.592 "num_base_bdevs": 4, 00:26:40.592 "num_base_bdevs_discovered": 3, 00:26:40.592 "num_base_bdevs_operational": 3, 00:26:40.592 "base_bdevs_list": [ 00:26:40.592 { 00:26:40.592 "name": null, 00:26:40.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.592 "is_configured": false, 00:26:40.592 "data_offset": 2048, 00:26:40.592 "data_size": 63488 00:26:40.592 }, 00:26:40.592 { 00:26:40.592 "name": "BaseBdev2", 00:26:40.592 "uuid": "55a6effd-25e7-5c0f-875b-8661623062c9", 00:26:40.592 "is_configured": true, 00:26:40.592 "data_offset": 2048, 00:26:40.592 "data_size": 63488 00:26:40.592 }, 00:26:40.592 { 00:26:40.592 "name": "BaseBdev3", 00:26:40.592 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:40.592 "is_configured": true, 00:26:40.592 "data_offset": 2048, 00:26:40.592 "data_size": 63488 00:26:40.592 }, 00:26:40.592 { 00:26:40.592 "name": "BaseBdev4", 00:26:40.592 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:40.592 "is_configured": true, 00:26:40.592 "data_offset": 2048, 00:26:40.592 "data_size": 63488 00:26:40.592 } 00:26:40.592 ] 00:26:40.592 }' 00:26:40.592 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:40.592 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:41.529 "name": "raid_bdev1", 00:26:41.529 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:41.529 "strip_size_kb": 0, 00:26:41.529 "state": "online", 00:26:41.529 "raid_level": "raid1", 00:26:41.529 "superblock": true, 00:26:41.529 "num_base_bdevs": 4, 00:26:41.529 "num_base_bdevs_discovered": 3, 00:26:41.529 "num_base_bdevs_operational": 3, 00:26:41.529 "base_bdevs_list": [ 00:26:41.529 { 00:26:41.529 "name": null, 00:26:41.529 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.529 "is_configured": false, 00:26:41.529 "data_offset": 2048, 00:26:41.529 "data_size": 63488 00:26:41.529 }, 00:26:41.529 { 00:26:41.529 "name": "BaseBdev2", 00:26:41.529 "uuid": "55a6effd-25e7-5c0f-875b-8661623062c9", 00:26:41.529 "is_configured": true, 00:26:41.529 "data_offset": 2048, 00:26:41.529 "data_size": 63488 00:26:41.529 }, 00:26:41.529 { 00:26:41.529 "name": "BaseBdev3", 00:26:41.529 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:41.529 "is_configured": true, 00:26:41.529 "data_offset": 2048, 00:26:41.529 "data_size": 63488 00:26:41.529 }, 00:26:41.529 { 00:26:41.529 "name": "BaseBdev4", 00:26:41.529 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:41.529 "is_configured": true, 00:26:41.529 "data_offset": 2048, 00:26:41.529 "data_size": 63488 00:26:41.529 } 00:26:41.529 ] 00:26:41.529 }' 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:41.529 07:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.529 07:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:41.529 07:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:41.787 [2024-07-25 07:32:14.252276] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:41.787 07:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:26:42.046 [2024-07-25 07:32:14.344951] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x218a3c0 00:26:42.046 [2024-07-25 07:32:14.346362] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:42.046 [2024-07-25 07:32:14.474301] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:42.046 [2024-07-25 07:32:14.474580] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:42.304 [2024-07-25 07:32:14.678947] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:42.304 [2024-07-25 07:32:14.679108] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:42.563 [2024-07-25 07:32:14.960764] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:42.822 [2024-07-25 07:32:15.098609] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:42.822 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:42.822 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:42.822 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:42.822 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:42.822 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:42.822 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.822 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:43.081 [2024-07-25 07:32:15.484055] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:43.081 [2024-07-25 07:32:15.485244] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:43.081 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:43.081 "name": "raid_bdev1", 00:26:43.081 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:43.081 "strip_size_kb": 0, 00:26:43.081 "state": "online", 00:26:43.081 "raid_level": "raid1", 00:26:43.081 "superblock": true, 00:26:43.081 "num_base_bdevs": 4, 00:26:43.081 "num_base_bdevs_discovered": 4, 00:26:43.081 "num_base_bdevs_operational": 4, 00:26:43.081 "process": { 00:26:43.081 "type": "rebuild", 00:26:43.081 "target": "spare", 00:26:43.081 "progress": { 00:26:43.081 "blocks": 14336, 00:26:43.081 "percent": 22 00:26:43.081 } 00:26:43.081 }, 00:26:43.081 "base_bdevs_list": [ 00:26:43.081 { 00:26:43.081 "name": "spare", 00:26:43.081 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:43.081 "is_configured": true, 00:26:43.081 "data_offset": 2048, 00:26:43.081 "data_size": 63488 00:26:43.081 }, 00:26:43.081 { 00:26:43.081 "name": "BaseBdev2", 00:26:43.081 "uuid": "55a6effd-25e7-5c0f-875b-8661623062c9", 00:26:43.081 "is_configured": true, 00:26:43.081 "data_offset": 2048, 00:26:43.081 "data_size": 63488 00:26:43.081 }, 00:26:43.081 { 00:26:43.081 "name": "BaseBdev3", 00:26:43.081 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:43.081 "is_configured": true, 00:26:43.081 "data_offset": 2048, 00:26:43.081 "data_size": 63488 00:26:43.081 }, 00:26:43.081 { 00:26:43.081 "name": "BaseBdev4", 00:26:43.081 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:43.081 "is_configured": true, 00:26:43.081 "data_offset": 2048, 00:26:43.081 "data_size": 63488 00:26:43.081 } 00:26:43.081 ] 00:26:43.081 }' 00:26:43.081 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:43.081 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:43.081 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:43.340 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:43.340 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:26:43.340 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:26:43.340 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:26:43.340 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:26:43.340 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:26:43.340 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:26:43.340 07:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:43.340 [2024-07-25 07:32:15.716912] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:43.340 [2024-07-25 07:32:15.717104] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:43.340 [2024-07-25 07:32:15.858682] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:43.598 [2024-07-25 07:32:16.006681] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:43.857 [2024-07-25 07:32:16.207279] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x20e0920 00:26:43.857 [2024-07-25 07:32:16.207305] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x218a3c0 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.857 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:44.116 "name": "raid_bdev1", 00:26:44.116 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:44.116 "strip_size_kb": 0, 00:26:44.116 "state": "online", 00:26:44.116 "raid_level": "raid1", 
00:26:44.116 "superblock": true, 00:26:44.116 "num_base_bdevs": 4, 00:26:44.116 "num_base_bdevs_discovered": 3, 00:26:44.116 "num_base_bdevs_operational": 3, 00:26:44.116 "process": { 00:26:44.116 "type": "rebuild", 00:26:44.116 "target": "spare", 00:26:44.116 "progress": { 00:26:44.116 "blocks": 22528, 00:26:44.116 "percent": 35 00:26:44.116 } 00:26:44.116 }, 00:26:44.116 "base_bdevs_list": [ 00:26:44.116 { 00:26:44.116 "name": "spare", 00:26:44.116 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:44.116 "is_configured": true, 00:26:44.116 "data_offset": 2048, 00:26:44.116 "data_size": 63488 00:26:44.116 }, 00:26:44.116 { 00:26:44.116 "name": null, 00:26:44.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.116 "is_configured": false, 00:26:44.116 "data_offset": 2048, 00:26:44.116 "data_size": 63488 00:26:44.116 }, 00:26:44.116 { 00:26:44.116 "name": "BaseBdev3", 00:26:44.116 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:44.116 "is_configured": true, 00:26:44.116 "data_offset": 2048, 00:26:44.116 "data_size": 63488 00:26:44.116 }, 00:26:44.116 { 00:26:44.116 "name": "BaseBdev4", 00:26:44.116 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:44.116 "is_configured": true, 00:26:44.116 "data_offset": 2048, 00:26:44.116 "data_size": 63488 00:26:44.116 } 00:26:44.116 ] 00:26:44.116 }' 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=914 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.116 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.374 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:44.374 "name": "raid_bdev1", 00:26:44.374 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:44.374 "strip_size_kb": 0, 00:26:44.374 "state": "online", 00:26:44.374 "raid_level": "raid1", 00:26:44.374 "superblock": true, 00:26:44.374 "num_base_bdevs": 4, 00:26:44.374 "num_base_bdevs_discovered": 3, 00:26:44.374 "num_base_bdevs_operational": 3, 00:26:44.374 "process": { 00:26:44.374 "type": "rebuild", 00:26:44.374 "target": "spare", 00:26:44.374 "progress": { 00:26:44.374 "blocks": 26624, 00:26:44.374 "percent": 41 
00:26:44.374 } 00:26:44.374 }, 00:26:44.374 "base_bdevs_list": [ 00:26:44.374 { 00:26:44.374 "name": "spare", 00:26:44.374 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:44.374 "is_configured": true, 00:26:44.374 "data_offset": 2048, 00:26:44.374 "data_size": 63488 00:26:44.374 }, 00:26:44.374 { 00:26:44.374 "name": null, 00:26:44.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.374 "is_configured": false, 00:26:44.375 "data_offset": 2048, 00:26:44.375 "data_size": 63488 00:26:44.375 }, 00:26:44.375 { 00:26:44.375 "name": "BaseBdev3", 00:26:44.375 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:44.375 "is_configured": true, 00:26:44.375 "data_offset": 2048, 00:26:44.375 "data_size": 63488 00:26:44.375 }, 00:26:44.375 { 00:26:44.375 "name": "BaseBdev4", 00:26:44.375 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:44.375 "is_configured": true, 00:26:44.375 "data_offset": 2048, 00:26:44.375 "data_size": 63488 00:26:44.375 } 00:26:44.375 ] 00:26:44.375 }' 00:26:44.375 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:44.375 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:44.375 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:44.375 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:44.375 07:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:44.633 [2024-07-25 07:32:17.018919] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:44.892 [2024-07-25 07:32:17.238122] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:45.150 [2024-07-25 07:32:17.534410] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:45.150 [2024-07-25 07:32:17.661916] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.409 07:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.667 07:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:45.667 "name": "raid_bdev1", 00:26:45.667 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:45.667 "strip_size_kb": 0, 00:26:45.667 "state": "online", 00:26:45.667 "raid_level": "raid1", 00:26:45.667 
"superblock": true, 00:26:45.667 "num_base_bdevs": 4, 00:26:45.667 "num_base_bdevs_discovered": 3, 00:26:45.667 "num_base_bdevs_operational": 3, 00:26:45.667 "process": { 00:26:45.667 "type": "rebuild", 00:26:45.667 "target": "spare", 00:26:45.667 "progress": { 00:26:45.667 "blocks": 45056, 00:26:45.667 "percent": 70 00:26:45.667 } 00:26:45.667 }, 00:26:45.667 "base_bdevs_list": [ 00:26:45.667 { 00:26:45.667 "name": "spare", 00:26:45.667 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:45.667 "is_configured": true, 00:26:45.667 "data_offset": 2048, 00:26:45.667 "data_size": 63488 00:26:45.667 }, 00:26:45.667 { 00:26:45.667 "name": null, 00:26:45.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.667 "is_configured": false, 00:26:45.667 "data_offset": 2048, 00:26:45.667 "data_size": 63488 00:26:45.667 }, 00:26:45.667 { 00:26:45.667 "name": "BaseBdev3", 00:26:45.667 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:45.667 "is_configured": true, 00:26:45.667 "data_offset": 2048, 00:26:45.667 "data_size": 63488 00:26:45.667 }, 00:26:45.667 { 00:26:45.667 "name": "BaseBdev4", 00:26:45.667 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:45.667 "is_configured": true, 00:26:45.667 "data_offset": 2048, 00:26:45.667 "data_size": 63488 00:26:45.667 } 00:26:45.667 ] 00:26:45.667 }' 00:26:45.667 07:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:45.667 [2024-07-25 07:32:18.117307] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:26:45.667 07:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:45.668 07:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:45.668 07:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:45.668 07:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:45.926 [2024-07-25 07:32:18.346274] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:45.926 [2024-07-25 07:32:18.454827] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:45.926 [2024-07-25 07:32:18.455083] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:46.493 [2024-07-25 07:32:18.793342] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:26:46.493 [2024-07-25 07:32:18.895442] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:46.493 [2024-07-25 07:32:18.895593] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:46.752 [2024-07-25 07:32:19.123592] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:46.752 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:46.752 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:46.752 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:46.752 07:32:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:46.752 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:46.752 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:46.752 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.752 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.752 [2024-07-25 07:32:19.223854] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:46.752 [2024-07-25 07:32:19.225857] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:47.011 "name": "raid_bdev1", 00:26:47.011 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:47.011 "strip_size_kb": 0, 00:26:47.011 "state": "online", 00:26:47.011 "raid_level": "raid1", 00:26:47.011 "superblock": true, 00:26:47.011 "num_base_bdevs": 4, 00:26:47.011 "num_base_bdevs_discovered": 3, 00:26:47.011 "num_base_bdevs_operational": 3, 00:26:47.011 "base_bdevs_list": [ 00:26:47.011 { 00:26:47.011 "name": "spare", 00:26:47.011 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:47.011 "is_configured": true, 00:26:47.011 "data_offset": 2048, 00:26:47.011 "data_size": 63488 00:26:47.011 }, 00:26:47.011 { 00:26:47.011 "name": null, 00:26:47.011 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:47.011 "is_configured": false, 00:26:47.011 "data_offset": 2048, 00:26:47.011 "data_size": 63488 00:26:47.011 }, 00:26:47.011 { 00:26:47.011 "name": "BaseBdev3", 00:26:47.011 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:47.011 "is_configured": true, 00:26:47.011 "data_offset": 2048, 00:26:47.011 "data_size": 63488 00:26:47.011 }, 00:26:47.011 { 00:26:47.011 "name": "BaseBdev4", 00:26:47.011 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:47.011 "is_configured": true, 00:26:47.011 "data_offset": 2048, 00:26:47.011 "data_size": 63488 00:26:47.011 } 00:26:47.011 ] 00:26:47.011 }' 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.011 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:47.272 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:47.272 "name": "raid_bdev1", 00:26:47.272 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:47.272 "strip_size_kb": 0, 00:26:47.272 "state": "online", 00:26:47.272 "raid_level": "raid1", 00:26:47.272 "superblock": true, 00:26:47.272 "num_base_bdevs": 4, 00:26:47.272 "num_base_bdevs_discovered": 3, 00:26:47.272 "num_base_bdevs_operational": 3, 00:26:47.272 "base_bdevs_list": [ 00:26:47.272 { 00:26:47.272 "name": "spare", 00:26:47.272 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:47.272 "is_configured": true, 00:26:47.272 "data_offset": 2048, 00:26:47.272 "data_size": 63488 00:26:47.272 }, 00:26:47.272 { 00:26:47.272 "name": null, 00:26:47.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:47.272 "is_configured": false, 00:26:47.272 "data_offset": 2048, 00:26:47.272 "data_size": 63488 00:26:47.272 }, 00:26:47.272 { 00:26:47.272 "name": "BaseBdev3", 00:26:47.272 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:47.272 "is_configured": true, 00:26:47.272 "data_offset": 2048, 00:26:47.272 "data_size": 63488 00:26:47.272 }, 00:26:47.272 { 00:26:47.272 "name": "BaseBdev4", 00:26:47.272 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:47.272 "is_configured": true, 00:26:47.272 "data_offset": 2048, 00:26:47.272 "data_size": 63488 00:26:47.272 } 00:26:47.272 ] 00:26:47.272 }' 00:26:47.272 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.534 07:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "raid_bdev1")' 00:26:47.793 07:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:47.793 "name": "raid_bdev1", 00:26:47.793 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:47.793 "strip_size_kb": 0, 00:26:47.793 "state": "online", 00:26:47.793 "raid_level": "raid1", 00:26:47.793 "superblock": true, 00:26:47.793 "num_base_bdevs": 4, 00:26:47.793 "num_base_bdevs_discovered": 3, 00:26:47.793 "num_base_bdevs_operational": 3, 00:26:47.793 "base_bdevs_list": [ 00:26:47.793 { 00:26:47.793 "name": "spare", 00:26:47.793 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:47.793 "is_configured": true, 00:26:47.793 "data_offset": 2048, 00:26:47.793 "data_size": 63488 00:26:47.793 }, 00:26:47.793 { 00:26:47.793 "name": null, 00:26:47.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:47.793 "is_configured": false, 00:26:47.793 "data_offset": 2048, 00:26:47.793 "data_size": 63488 00:26:47.793 }, 00:26:47.793 { 00:26:47.793 "name": "BaseBdev3", 00:26:47.793 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:47.793 "is_configured": true, 00:26:47.793 "data_offset": 2048, 00:26:47.793 "data_size": 63488 00:26:47.793 }, 00:26:47.793 { 00:26:47.793 "name": "BaseBdev4", 00:26:47.793 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:47.793 "is_configured": true, 00:26:47.793 "data_offset": 2048, 00:26:47.793 "data_size": 63488 00:26:47.793 } 00:26:47.793 ] 00:26:47.793 }' 00:26:47.793 07:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:47.793 07:32:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:48.360 07:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:48.360 [2024-07-25 07:32:20.837296] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:48.360 [2024-07-25 07:32:20.837324] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:48.360 00:26:48.360 Latency(us) 00:26:48.360 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:48.360 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:48.360 raid_bdev1 : 10.98 106.41 319.24 0.00 0.00 12435.09 268.70 119118.23 00:26:48.360 =================================================================================================================== 00:26:48.360 Total : 106.41 319.24 0.00 0.00 12435.09 268.70 119118.23 00:26:48.360 [2024-07-25 07:32:20.873010] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:48.360 [2024-07-25 07:32:20.873036] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:48.360 [2024-07-25 07:32:20.873123] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:48.360 [2024-07-25 07:32:20.873133] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21812f0 name raid_bdev1, state offline 00:26:48.360 0 00:26:48.618 07:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.618 07:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # [[ 
0 == 0 ]] 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:48.618 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:48.877 /dev/nbd0 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:48.877 1+0 records in 00:26:48.877 1+0 records out 00:26:48.877 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253547 s, 16.2 MB/s 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:26:48.877 07:32:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@743 -- # continue 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:48.877 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:49.136 /dev/nbd1 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:49.136 1+0 records in 00:26:49.136 1+0 records out 00:26:49.136 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002771 s, 14.8 MB/s 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.136 07:32:21 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:49.136 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:49.394 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:49.394 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:49.394 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:49.394 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:49.394 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:49.394 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:49.394 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:49.652 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:49.652 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:49.652 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:49.652 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:49.652 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:49.653 07:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:49.911 /dev/nbd1 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:49.911 1+0 records in 00:26:49.911 1+0 records out 00:26:49.911 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189789 s, 21.6 MB/s 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:49.911 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:50.170 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:26:50.428 07:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:50.687 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:50.946 [2024-07-25 07:32:23.269126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:50.946 [2024-07-25 07:32:23.269171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:50.946 [2024-07-25 07:32:23.269189] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d90b0 00:26:50.946 
[2024-07-25 07:32:23.269200] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:50.946 [2024-07-25 07:32:23.270772] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:50.946 [2024-07-25 07:32:23.270798] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:50.946 [2024-07-25 07:32:23.270866] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:50.946 [2024-07-25 07:32:23.270890] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:50.946 [2024-07-25 07:32:23.270983] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:50.946 [2024-07-25 07:32:23.271052] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:50.946 spare 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.946 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.946 [2024-07-25 07:32:23.371366] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20e0600 00:26:50.946 [2024-07-25 07:32:23.371381] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:50.946 [2024-07-25 07:32:23.371541] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21818d0 00:26:50.946 [2024-07-25 07:32:23.371670] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20e0600 00:26:50.946 [2024-07-25 07:32:23.371680] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20e0600 00:26:50.946 [2024-07-25 07:32:23.371770] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:51.204 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:51.204 "name": "raid_bdev1", 00:26:51.204 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:51.204 "strip_size_kb": 0, 00:26:51.204 "state": "online", 00:26:51.204 "raid_level": "raid1", 00:26:51.204 "superblock": true, 00:26:51.204 "num_base_bdevs": 4, 00:26:51.204 "num_base_bdevs_discovered": 3, 00:26:51.204 "num_base_bdevs_operational": 3, 00:26:51.204 
"base_bdevs_list": [ 00:26:51.204 { 00:26:51.204 "name": "spare", 00:26:51.204 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:51.204 "is_configured": true, 00:26:51.204 "data_offset": 2048, 00:26:51.204 "data_size": 63488 00:26:51.204 }, 00:26:51.204 { 00:26:51.204 "name": null, 00:26:51.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.204 "is_configured": false, 00:26:51.204 "data_offset": 2048, 00:26:51.204 "data_size": 63488 00:26:51.204 }, 00:26:51.204 { 00:26:51.204 "name": "BaseBdev3", 00:26:51.204 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:51.204 "is_configured": true, 00:26:51.204 "data_offset": 2048, 00:26:51.204 "data_size": 63488 00:26:51.204 }, 00:26:51.204 { 00:26:51.204 "name": "BaseBdev4", 00:26:51.204 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:51.204 "is_configured": true, 00:26:51.204 "data_offset": 2048, 00:26:51.204 "data_size": 63488 00:26:51.204 } 00:26:51.204 ] 00:26:51.204 }' 00:26:51.204 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:51.204 07:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:51.769 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:51.769 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.769 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:51.769 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:51.769 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.770 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.770 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.028 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:52.028 "name": "raid_bdev1", 00:26:52.028 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:52.028 "strip_size_kb": 0, 00:26:52.028 "state": "online", 00:26:52.028 "raid_level": "raid1", 00:26:52.028 "superblock": true, 00:26:52.028 "num_base_bdevs": 4, 00:26:52.028 "num_base_bdevs_discovered": 3, 00:26:52.028 "num_base_bdevs_operational": 3, 00:26:52.028 "base_bdevs_list": [ 00:26:52.028 { 00:26:52.028 "name": "spare", 00:26:52.028 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:52.028 "is_configured": true, 00:26:52.028 "data_offset": 2048, 00:26:52.028 "data_size": 63488 00:26:52.028 }, 00:26:52.028 { 00:26:52.028 "name": null, 00:26:52.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.028 "is_configured": false, 00:26:52.028 "data_offset": 2048, 00:26:52.028 "data_size": 63488 00:26:52.028 }, 00:26:52.028 { 00:26:52.028 "name": "BaseBdev3", 00:26:52.028 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:52.028 "is_configured": true, 00:26:52.028 "data_offset": 2048, 00:26:52.028 "data_size": 63488 00:26:52.028 }, 00:26:52.028 { 00:26:52.028 "name": "BaseBdev4", 00:26:52.028 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:52.028 "is_configured": true, 00:26:52.028 "data_offset": 2048, 00:26:52.028 "data_size": 63488 00:26:52.028 } 00:26:52.028 ] 00:26:52.028 }' 00:26:52.028 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:52.028 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:52.028 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:52.028 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:52.028 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.028 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:52.287 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:26:52.287 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:52.545 [2024-07-25 07:32:24.825557] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.545 07:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.545 07:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:52.545 "name": "raid_bdev1", 00:26:52.545 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:52.545 "strip_size_kb": 0, 00:26:52.545 "state": "online", 00:26:52.545 "raid_level": "raid1", 00:26:52.545 "superblock": true, 00:26:52.545 "num_base_bdevs": 4, 00:26:52.545 "num_base_bdevs_discovered": 2, 00:26:52.545 "num_base_bdevs_operational": 2, 00:26:52.545 "base_bdevs_list": [ 00:26:52.545 { 00:26:52.545 "name": null, 00:26:52.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.545 "is_configured": false, 00:26:52.545 "data_offset": 2048, 00:26:52.546 "data_size": 63488 00:26:52.546 }, 00:26:52.546 { 00:26:52.546 "name": null, 00:26:52.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.546 "is_configured": false, 00:26:52.546 "data_offset": 2048, 00:26:52.546 "data_size": 63488 00:26:52.546 }, 00:26:52.546 
{ 00:26:52.546 "name": "BaseBdev3", 00:26:52.546 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:52.546 "is_configured": true, 00:26:52.546 "data_offset": 2048, 00:26:52.546 "data_size": 63488 00:26:52.546 }, 00:26:52.546 { 00:26:52.546 "name": "BaseBdev4", 00:26:52.546 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:52.546 "is_configured": true, 00:26:52.546 "data_offset": 2048, 00:26:52.546 "data_size": 63488 00:26:52.546 } 00:26:52.546 ] 00:26:52.546 }' 00:26:52.546 07:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:52.546 07:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:53.112 07:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:53.370 [2024-07-25 07:32:25.784215] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:53.370 [2024-07-25 07:32:25.784348] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:53.370 [2024-07-25 07:32:25.784363] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:53.370 [2024-07-25 07:32:25.784389] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:53.370 [2024-07-25 07:32:25.788590] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d98b0 00:26:53.370 [2024-07-25 07:32:25.790664] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:53.370 07:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:26:54.304 07:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:54.304 07:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:54.304 07:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:54.304 07:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:54.304 07:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:54.304 07:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.304 07:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.562 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:54.562 "name": "raid_bdev1", 00:26:54.562 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:54.562 "strip_size_kb": 0, 00:26:54.562 "state": "online", 00:26:54.562 "raid_level": "raid1", 00:26:54.562 "superblock": true, 00:26:54.562 "num_base_bdevs": 4, 00:26:54.562 "num_base_bdevs_discovered": 3, 00:26:54.562 "num_base_bdevs_operational": 3, 00:26:54.562 "process": { 00:26:54.562 "type": "rebuild", 00:26:54.562 "target": "spare", 00:26:54.562 "progress": { 00:26:54.562 "blocks": 24576, 00:26:54.562 "percent": 38 00:26:54.562 } 00:26:54.562 }, 00:26:54.562 "base_bdevs_list": [ 00:26:54.562 { 00:26:54.562 "name": "spare", 00:26:54.562 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:54.562 "is_configured": true, 
00:26:54.562 "data_offset": 2048, 00:26:54.562 "data_size": 63488 00:26:54.562 }, 00:26:54.562 { 00:26:54.562 "name": null, 00:26:54.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.562 "is_configured": false, 00:26:54.563 "data_offset": 2048, 00:26:54.563 "data_size": 63488 00:26:54.563 }, 00:26:54.563 { 00:26:54.563 "name": "BaseBdev3", 00:26:54.563 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:54.563 "is_configured": true, 00:26:54.563 "data_offset": 2048, 00:26:54.563 "data_size": 63488 00:26:54.563 }, 00:26:54.563 { 00:26:54.563 "name": "BaseBdev4", 00:26:54.563 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:54.563 "is_configured": true, 00:26:54.563 "data_offset": 2048, 00:26:54.563 "data_size": 63488 00:26:54.563 } 00:26:54.563 ] 00:26:54.563 }' 00:26:54.563 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:54.563 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:54.563 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:54.821 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:54.821 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:54.821 [2024-07-25 07:32:27.345880] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:55.079 [2024-07-25 07:32:27.402414] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:55.079 [2024-07-25 07:32:27.402454] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:55.079 [2024-07-25 07:32:27.402469] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:55.079 [2024-07-25 07:32:27.402476] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.079 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:26:55.337 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.337 "name": "raid_bdev1", 00:26:55.337 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:55.337 "strip_size_kb": 0, 00:26:55.337 "state": "online", 00:26:55.337 "raid_level": "raid1", 00:26:55.337 "superblock": true, 00:26:55.337 "num_base_bdevs": 4, 00:26:55.337 "num_base_bdevs_discovered": 2, 00:26:55.337 "num_base_bdevs_operational": 2, 00:26:55.337 "base_bdevs_list": [ 00:26:55.337 { 00:26:55.337 "name": null, 00:26:55.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.337 "is_configured": false, 00:26:55.337 "data_offset": 2048, 00:26:55.337 "data_size": 63488 00:26:55.337 }, 00:26:55.337 { 00:26:55.337 "name": null, 00:26:55.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.337 "is_configured": false, 00:26:55.337 "data_offset": 2048, 00:26:55.337 "data_size": 63488 00:26:55.337 }, 00:26:55.337 { 00:26:55.337 "name": "BaseBdev3", 00:26:55.337 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:55.337 "is_configured": true, 00:26:55.337 "data_offset": 2048, 00:26:55.337 "data_size": 63488 00:26:55.337 }, 00:26:55.337 { 00:26:55.337 "name": "BaseBdev4", 00:26:55.337 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:55.337 "is_configured": true, 00:26:55.337 "data_offset": 2048, 00:26:55.337 "data_size": 63488 00:26:55.337 } 00:26:55.337 ] 00:26:55.337 }' 00:26:55.337 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.337 07:32:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:55.903 07:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:55.903 [2024-07-25 07:32:28.425329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:55.903 [2024-07-25 07:32:28.425373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:55.903 [2024-07-25 07:32:28.425393] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x228f2b0 00:26:55.903 [2024-07-25 07:32:28.425405] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:55.903 [2024-07-25 07:32:28.425735] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:55.903 [2024-07-25 07:32:28.425756] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:55.903 [2024-07-25 07:32:28.425826] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:55.903 [2024-07-25 07:32:28.425837] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:55.903 [2024-07-25 07:32:28.425847] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
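The remove/re-add cycle being exercised here reduces to two RPCs plus a progress check. The sketch below only restates commands that already appear in the trace (bdev_passthru_delete, bdev_passthru_create, bdev_raid_get_bdevs with the .process jq filter); the polling loop at the end is an illustrative addition and not part of bdev_raid.sh itself, which instead sleeps once and asserts on the rebuild target.

    # Illustrative reduction of the remove/re-add sequence traced above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Dropping the passthru device degrades raid_bdev1; re-creating it lets the examine
    # path find the on-disk superblock and re-add the device, which starts a rebuild
    # targeting "spare" (the "Re-adding bdev spare" and "Started rebuild" messages above).
    "$rpc" -s "$sock" bdev_passthru_delete spare
    "$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare
    # Poll the rebuild state reported in .process until it clears (rebuild finished).
    while [[ $("$rpc" -s "$sock" bdev_raid_get_bdevs all \
               | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"') == rebuild ]]; do
        sleep 1
    done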
00:26:55.903 [2024-07-25 07:32:28.425864] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:55.903 [2024-07-25 07:32:28.430082] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21818d0 00:26:55.903 spare 00:26:55.903 [2024-07-25 07:32:28.431460] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:56.161 07:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:26:57.097 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:57.097 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.097 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:57.097 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:57.097 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.097 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.097 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.356 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.356 "name": "raid_bdev1", 00:26:57.356 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:57.356 "strip_size_kb": 0, 00:26:57.356 "state": "online", 00:26:57.356 "raid_level": "raid1", 00:26:57.356 "superblock": true, 00:26:57.356 "num_base_bdevs": 4, 00:26:57.356 "num_base_bdevs_discovered": 3, 00:26:57.356 "num_base_bdevs_operational": 3, 00:26:57.356 "process": { 00:26:57.356 "type": "rebuild", 00:26:57.356 "target": "spare", 00:26:57.356 "progress": { 00:26:57.356 "blocks": 24576, 00:26:57.356 "percent": 38 00:26:57.356 } 00:26:57.356 }, 00:26:57.356 "base_bdevs_list": [ 00:26:57.356 { 00:26:57.356 "name": "spare", 00:26:57.356 "uuid": "9aa59cf9-3b40-5962-9fff-c7a8f8ebc947", 00:26:57.356 "is_configured": true, 00:26:57.356 "data_offset": 2048, 00:26:57.356 "data_size": 63488 00:26:57.356 }, 00:26:57.356 { 00:26:57.356 "name": null, 00:26:57.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.356 "is_configured": false, 00:26:57.356 "data_offset": 2048, 00:26:57.356 "data_size": 63488 00:26:57.356 }, 00:26:57.356 { 00:26:57.356 "name": "BaseBdev3", 00:26:57.356 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:57.356 "is_configured": true, 00:26:57.356 "data_offset": 2048, 00:26:57.356 "data_size": 63488 00:26:57.356 }, 00:26:57.356 { 00:26:57.356 "name": "BaseBdev4", 00:26:57.356 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:57.356 "is_configured": true, 00:26:57.356 "data_offset": 2048, 00:26:57.356 "data_size": 63488 00:26:57.356 } 00:26:57.356 ] 00:26:57.356 }' 00:26:57.356 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.356 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:57.356 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.356 07:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:57.356 07:32:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:57.615 [2024-07-25 07:32:29.979105] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.615 [2024-07-25 07:32:30.043237] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:57.615 [2024-07-25 07:32:30.043282] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.615 [2024-07-25 07:32:30.043296] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.615 [2024-07-25 07:32:30.043304] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.615 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.874 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.874 "name": "raid_bdev1", 00:26:57.874 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:57.874 "strip_size_kb": 0, 00:26:57.874 "state": "online", 00:26:57.874 "raid_level": "raid1", 00:26:57.874 "superblock": true, 00:26:57.874 "num_base_bdevs": 4, 00:26:57.874 "num_base_bdevs_discovered": 2, 00:26:57.874 "num_base_bdevs_operational": 2, 00:26:57.874 "base_bdevs_list": [ 00:26:57.874 { 00:26:57.874 "name": null, 00:26:57.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.874 "is_configured": false, 00:26:57.874 "data_offset": 2048, 00:26:57.874 "data_size": 63488 00:26:57.874 }, 00:26:57.874 { 00:26:57.874 "name": null, 00:26:57.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.874 "is_configured": false, 00:26:57.874 "data_offset": 2048, 00:26:57.874 "data_size": 63488 00:26:57.874 }, 00:26:57.874 { 00:26:57.874 "name": "BaseBdev3", 00:26:57.874 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:57.874 "is_configured": true, 00:26:57.874 "data_offset": 2048, 00:26:57.874 "data_size": 63488 00:26:57.874 }, 00:26:57.874 { 00:26:57.874 "name": "BaseBdev4", 00:26:57.874 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 
00:26:57.874 "is_configured": true, 00:26:57.874 "data_offset": 2048, 00:26:57.874 "data_size": 63488 00:26:57.874 } 00:26:57.874 ] 00:26:57.874 }' 00:26:57.874 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.874 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:58.441 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:58.441 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:58.441 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:58.441 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:58.441 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:58.441 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.441 07:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.700 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:58.700 "name": "raid_bdev1", 00:26:58.700 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:26:58.700 "strip_size_kb": 0, 00:26:58.700 "state": "online", 00:26:58.700 "raid_level": "raid1", 00:26:58.700 "superblock": true, 00:26:58.700 "num_base_bdevs": 4, 00:26:58.700 "num_base_bdevs_discovered": 2, 00:26:58.700 "num_base_bdevs_operational": 2, 00:26:58.700 "base_bdevs_list": [ 00:26:58.700 { 00:26:58.700 "name": null, 00:26:58.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.700 "is_configured": false, 00:26:58.700 "data_offset": 2048, 00:26:58.700 "data_size": 63488 00:26:58.700 }, 00:26:58.700 { 00:26:58.700 "name": null, 00:26:58.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.700 "is_configured": false, 00:26:58.700 "data_offset": 2048, 00:26:58.700 "data_size": 63488 00:26:58.700 }, 00:26:58.700 { 00:26:58.700 "name": "BaseBdev3", 00:26:58.700 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:26:58.700 "is_configured": true, 00:26:58.700 "data_offset": 2048, 00:26:58.700 "data_size": 63488 00:26:58.700 }, 00:26:58.700 { 00:26:58.700 "name": "BaseBdev4", 00:26:58.700 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:26:58.700 "is_configured": true, 00:26:58.700 "data_offset": 2048, 00:26:58.700 "data_size": 63488 00:26:58.700 } 00:26:58.700 ] 00:26:58.700 }' 00:26:58.700 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:58.700 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:58.700 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:58.700 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:58.700 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:58.959 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc 
-p BaseBdev1 00:26:59.237 [2024-07-25 07:32:31.631791] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:59.237 [2024-07-25 07:32:31.631832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:59.237 [2024-07-25 07:32:31.631849] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e0a70 00:26:59.237 [2024-07-25 07:32:31.631860] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:59.237 [2024-07-25 07:32:31.632174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:59.238 [2024-07-25 07:32:31.632192] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:59.238 [2024-07-25 07:32:31.632255] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:59.238 [2024-07-25 07:32:31.632273] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:59.238 [2024-07-25 07:32:31.632288] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:59.238 BaseBdev1 00:26:59.238 07:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.189 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:00.447 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:00.447 "name": "raid_bdev1", 00:27:00.447 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:27:00.447 "strip_size_kb": 0, 00:27:00.447 "state": "online", 00:27:00.447 "raid_level": "raid1", 00:27:00.447 "superblock": true, 00:27:00.447 "num_base_bdevs": 4, 00:27:00.447 "num_base_bdevs_discovered": 2, 00:27:00.447 "num_base_bdevs_operational": 2, 00:27:00.447 "base_bdevs_list": [ 00:27:00.447 { 00:27:00.447 "name": null, 00:27:00.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.447 "is_configured": false, 00:27:00.447 "data_offset": 2048, 00:27:00.447 "data_size": 63488 00:27:00.447 }, 00:27:00.447 { 00:27:00.447 "name": null, 00:27:00.447 
"uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.447 "is_configured": false, 00:27:00.447 "data_offset": 2048, 00:27:00.448 "data_size": 63488 00:27:00.448 }, 00:27:00.448 { 00:27:00.448 "name": "BaseBdev3", 00:27:00.448 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:27:00.448 "is_configured": true, 00:27:00.448 "data_offset": 2048, 00:27:00.448 "data_size": 63488 00:27:00.448 }, 00:27:00.448 { 00:27:00.448 "name": "BaseBdev4", 00:27:00.448 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:27:00.448 "is_configured": true, 00:27:00.448 "data_offset": 2048, 00:27:00.448 "data_size": 63488 00:27:00.448 } 00:27:00.448 ] 00:27:00.448 }' 00:27:00.448 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:00.448 07:32:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:01.013 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:01.013 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:01.013 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:01.013 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:01.013 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:01.013 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.013 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:01.271 "name": "raid_bdev1", 00:27:01.271 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:27:01.271 "strip_size_kb": 0, 00:27:01.271 "state": "online", 00:27:01.271 "raid_level": "raid1", 00:27:01.271 "superblock": true, 00:27:01.271 "num_base_bdevs": 4, 00:27:01.271 "num_base_bdevs_discovered": 2, 00:27:01.271 "num_base_bdevs_operational": 2, 00:27:01.271 "base_bdevs_list": [ 00:27:01.271 { 00:27:01.271 "name": null, 00:27:01.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.271 "is_configured": false, 00:27:01.271 "data_offset": 2048, 00:27:01.271 "data_size": 63488 00:27:01.271 }, 00:27:01.271 { 00:27:01.271 "name": null, 00:27:01.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.271 "is_configured": false, 00:27:01.271 "data_offset": 2048, 00:27:01.271 "data_size": 63488 00:27:01.271 }, 00:27:01.271 { 00:27:01.271 "name": "BaseBdev3", 00:27:01.271 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:27:01.271 "is_configured": true, 00:27:01.271 "data_offset": 2048, 00:27:01.271 "data_size": 63488 00:27:01.271 }, 00:27:01.271 { 00:27:01.271 "name": "BaseBdev4", 00:27:01.271 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:27:01.271 "is_configured": true, 00:27:01.271 "data_offset": 2048, 00:27:01.271 "data_size": 63488 00:27:01.271 } 00:27:01.271 ] 00:27:01.271 }' 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:01.271 07:32:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local es=0 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:01.271 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:01.529 [2024-07-25 07:32:33.966265] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:01.530 [2024-07-25 07:32:33.966375] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:01.530 [2024-07-25 07:32:33.966390] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:01.530 request: 00:27:01.530 { 00:27:01.530 "base_bdev": "BaseBdev1", 00:27:01.530 "raid_bdev": "raid_bdev1", 00:27:01.530 "method": "bdev_raid_add_base_bdev", 00:27:01.530 "req_id": 1 00:27:01.530 } 00:27:01.530 Got JSON-RPC error response 00:27:01.530 response: 00:27:01.530 { 00:27:01.530 "code": -22, 00:27:01.530 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:01.530 } 00:27:01.530 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:27:01.530 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:01.530 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:01.530 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:01.530 07:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:27:02.464 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:02.464 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:02.464 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:02.464 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.465 07:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:02.722 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:02.722 "name": "raid_bdev1", 00:27:02.722 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:27:02.722 "strip_size_kb": 0, 00:27:02.722 "state": "online", 00:27:02.722 "raid_level": "raid1", 00:27:02.722 "superblock": true, 00:27:02.722 "num_base_bdevs": 4, 00:27:02.722 "num_base_bdevs_discovered": 2, 00:27:02.722 "num_base_bdevs_operational": 2, 00:27:02.722 "base_bdevs_list": [ 00:27:02.722 { 00:27:02.722 "name": null, 00:27:02.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:02.722 "is_configured": false, 00:27:02.722 "data_offset": 2048, 00:27:02.722 "data_size": 63488 00:27:02.722 }, 00:27:02.722 { 00:27:02.722 "name": null, 00:27:02.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:02.722 "is_configured": false, 00:27:02.722 "data_offset": 2048, 00:27:02.722 "data_size": 63488 00:27:02.722 }, 00:27:02.722 { 00:27:02.722 "name": "BaseBdev3", 00:27:02.722 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:27:02.722 "is_configured": true, 00:27:02.722 "data_offset": 2048, 00:27:02.722 "data_size": 63488 00:27:02.722 }, 00:27:02.722 { 00:27:02.722 "name": "BaseBdev4", 00:27:02.722 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:27:02.722 "is_configured": true, 00:27:02.722 "data_offset": 2048, 00:27:02.722 "data_size": 63488 00:27:02.722 } 00:27:02.722 ] 00:27:02.722 }' 00:27:02.722 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:02.722 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:03.288 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:03.288 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.288 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:03.288 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:03.288 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:27:03.288 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.288 07:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.547 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:03.547 "name": "raid_bdev1", 00:27:03.547 "uuid": "7d2576dc-d3d2-4597-a1c1-6f52d1b1fe99", 00:27:03.547 "strip_size_kb": 0, 00:27:03.547 "state": "online", 00:27:03.547 "raid_level": "raid1", 00:27:03.547 "superblock": true, 00:27:03.547 "num_base_bdevs": 4, 00:27:03.547 "num_base_bdevs_discovered": 2, 00:27:03.547 "num_base_bdevs_operational": 2, 00:27:03.547 "base_bdevs_list": [ 00:27:03.547 { 00:27:03.547 "name": null, 00:27:03.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.547 "is_configured": false, 00:27:03.547 "data_offset": 2048, 00:27:03.547 "data_size": 63488 00:27:03.547 }, 00:27:03.547 { 00:27:03.547 "name": null, 00:27:03.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.547 "is_configured": false, 00:27:03.547 "data_offset": 2048, 00:27:03.547 "data_size": 63488 00:27:03.547 }, 00:27:03.547 { 00:27:03.547 "name": "BaseBdev3", 00:27:03.547 "uuid": "980b176c-9b4a-5d8d-b627-f20798eb6101", 00:27:03.547 "is_configured": true, 00:27:03.547 "data_offset": 2048, 00:27:03.547 "data_size": 63488 00:27:03.547 }, 00:27:03.547 { 00:27:03.547 "name": "BaseBdev4", 00:27:03.547 "uuid": "26ab1809-cd99-5920-9a1d-2e73ed93bf4c", 00:27:03.547 "is_configured": true, 00:27:03.547 "data_offset": 2048, 00:27:03.547 "data_size": 63488 00:27:03.547 } 00:27:03.547 ] 00:27:03.547 }' 00:27:03.547 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1747055 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1747055 ']' 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1747055 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1747055 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1747055' 00:27:03.806 killing process with pid 1747055 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1747055 00:27:03.806 Received shutdown signal, test time was about 26.249350 seconds 00:27:03.806 00:27:03.806 Latency(us) 00:27:03.806 
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:03.806 =================================================================================================================== 00:27:03.806 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:03.806 [2024-07-25 07:32:36.179796] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:03.806 [2024-07-25 07:32:36.179887] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:03.806 [2024-07-25 07:32:36.179942] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:03.806 [2024-07-25 07:32:36.179953] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e0600 name raid_bdev1, state offline 00:27:03.806 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1747055 00:27:03.806 [2024-07-25 07:32:36.213354] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:04.065 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:27:04.065 00:27:04.065 real 0m31.492s 00:27:04.065 user 0m49.237s 00:27:04.065 sys 0m4.978s 00:27:04.065 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:04.065 07:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:04.065 ************************************ 00:27:04.065 END TEST raid_rebuild_test_sb_io 00:27:04.065 ************************************ 00:27:04.065 07:32:36 bdev_raid -- bdev/bdev_raid.sh@964 -- # '[' n == y ']' 00:27:04.065 07:32:36 bdev_raid -- bdev/bdev_raid.sh@976 -- # base_blocklen=4096 00:27:04.065 07:32:36 bdev_raid -- bdev/bdev_raid.sh@978 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:27:04.065 07:32:36 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:27:04.065 07:32:36 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:04.065 07:32:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:04.065 ************************************ 00:27:04.065 START TEST raid_state_function_test_sb_4k 00:27:04.065 ************************************ 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:04.065 07:32:36 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1752689 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1752689' 00:27:04.065 Process raid pid: 1752689 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1752689 /var/tmp/spdk-raid.sock 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1752689 ']' 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:04.065 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:04.065 07:32:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:04.065 [2024-07-25 07:32:36.557175] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
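
At this point the harness has launched a dedicated bdev_svc target for the 4k state-function test (raid_pid 1752689 above) and is waiting for its JSON-RPC socket at /var/tmp/spdk-raid.sock; every check that follows is an rpc.py call against that socket. A minimal sketch of the setup the trace below performs, using only the socket path, bdev names and sizes already shown in this log (not an official recipe):

    # Drive the freshly started bdev_svc instance over its private RPC socket.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Two 32 MB malloc bdevs with 4096-byte blocks (8192 blocks each) become the RAID1 members.
    $RPC bdev_malloc_create 32 4096 -b BaseBdev1
    $RPC bdev_malloc_create 32 4096 -b BaseBdev2

    # -s requests an on-disk superblock, matching superblock_create_arg=-s in the trace above.
    $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

The DPDK/EAL and qat_pci_device_allocate() lines that follow are device-probe chatter from target start-up on this QAT-equipped node, not test failures.
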
00:27:04.065 [2024-07-25 07:32:36.557228] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.0 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.1 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.2 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.3 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.4 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.5 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.6 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:01.7 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.0 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.1 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.2 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.3 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.4 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.5 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.6 cannot be used 00:27:04.324 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.324 EAL: Requested device 0000:3d:02.7 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.0 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.1 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.2 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.3 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.4 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.5 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.6 cannot be used 00:27:04.325 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:01.7 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.0 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.1 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.2 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.3 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.4 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.5 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.6 cannot be used 00:27:04.325 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:04.325 EAL: Requested device 0000:3f:02.7 cannot be used 00:27:04.325 [2024-07-25 07:32:36.689420] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:04.325 [2024-07-25 07:32:36.776812] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:27:04.325 [2024-07-25 07:32:36.841715] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:04.325 [2024-07-25 07:32:36.841750] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:05.261 [2024-07-25 07:32:37.669449] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:05.261 [2024-07-25 07:32:37.669487] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:05.261 [2024-07-25 07:32:37.669497] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:05.261 [2024-07-25 07:32:37.669507] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.261 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:05.520 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:05.520 "name": "Existed_Raid", 00:27:05.520 "uuid": "195693ca-bef4-4a35-b61a-17bedd65a370", 00:27:05.520 "strip_size_kb": 0, 00:27:05.520 "state": "configuring", 00:27:05.520 "raid_level": "raid1", 00:27:05.520 "superblock": true, 00:27:05.520 "num_base_bdevs": 2, 00:27:05.520 "num_base_bdevs_discovered": 0, 00:27:05.520 "num_base_bdevs_operational": 2, 00:27:05.520 "base_bdevs_list": [ 00:27:05.520 { 00:27:05.520 "name": "BaseBdev1", 00:27:05.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.520 "is_configured": false, 00:27:05.520 "data_offset": 0, 00:27:05.520 "data_size": 0 00:27:05.520 }, 00:27:05.520 { 00:27:05.520 "name": "BaseBdev2", 00:27:05.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.520 "is_configured": false, 00:27:05.520 "data_offset": 0, 00:27:05.520 "data_size": 0 00:27:05.520 } 00:27:05.520 ] 00:27:05.520 }' 00:27:05.520 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:05.520 07:32:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:06.087 07:32:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:06.345 [2024-07-25 07:32:38.720166] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:06.345 [2024-07-25 07:32:38.720192] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x989ea0 name Existed_Raid, state configuring 00:27:06.345 07:32:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:06.604 [2024-07-25 07:32:38.948855] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:06.604 [2024-07-25 07:32:38.948881] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:06.604 [2024-07-25 07:32:38.948890] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:06.604 [2024-07-25 07:32:38.948901] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:06.604 07:32:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:27:06.862 [2024-07-25 07:32:39.182927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:06.862 BaseBdev1 00:27:06.862 
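
BaseBdev1 has just been created and claimed by the raid module; the next steps in the trace wait for the bdev to appear and then confirm that Existed_Raid is still only "configuring", since one of its two members is missing. A rough stand-in for those two checks, reusing commands already visible in this trace (the real waitforbdev / verify_raid_bdev_state helpers in the test scripts do additional bookkeeping, so treat this as a sketch):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # bdev_get_bdevs -t waits up to 2000 ms for the named bdev to show up (same call as in the trace below).
    $RPC bdev_get_bdevs -b BaseBdev1 -t 2000 > /dev/null

    # Dump the raid bdev and compare the fields the test cares about; values match the JSON dumped further down.
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    [ "$(jq -r '.state' <<< "$info")" = configuring ]
    [ "$(jq -r '.raid_level' <<< "$info")" = raid1 ]
    [ "$(jq -r '.num_base_bdevs_discovered' <<< "$info")" -eq 1 ]
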
07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:06.862 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:27:06.862 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:27:06.862 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:27:06.862 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:27:06.862 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:27:06.862 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:07.121 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:07.121 [ 00:27:07.121 { 00:27:07.121 "name": "BaseBdev1", 00:27:07.121 "aliases": [ 00:27:07.121 "00e24423-1256-4b03-82d2-a613e51e3a7c" 00:27:07.121 ], 00:27:07.121 "product_name": "Malloc disk", 00:27:07.121 "block_size": 4096, 00:27:07.121 "num_blocks": 8192, 00:27:07.121 "uuid": "00e24423-1256-4b03-82d2-a613e51e3a7c", 00:27:07.121 "assigned_rate_limits": { 00:27:07.121 "rw_ios_per_sec": 0, 00:27:07.121 "rw_mbytes_per_sec": 0, 00:27:07.121 "r_mbytes_per_sec": 0, 00:27:07.121 "w_mbytes_per_sec": 0 00:27:07.121 }, 00:27:07.121 "claimed": true, 00:27:07.121 "claim_type": "exclusive_write", 00:27:07.121 "zoned": false, 00:27:07.121 "supported_io_types": { 00:27:07.121 "read": true, 00:27:07.121 "write": true, 00:27:07.121 "unmap": true, 00:27:07.121 "flush": true, 00:27:07.121 "reset": true, 00:27:07.121 "nvme_admin": false, 00:27:07.121 "nvme_io": false, 00:27:07.121 "nvme_io_md": false, 00:27:07.121 "write_zeroes": true, 00:27:07.121 "zcopy": true, 00:27:07.121 "get_zone_info": false, 00:27:07.121 "zone_management": false, 00:27:07.121 "zone_append": false, 00:27:07.121 "compare": false, 00:27:07.121 "compare_and_write": false, 00:27:07.121 "abort": true, 00:27:07.121 "seek_hole": false, 00:27:07.121 "seek_data": false, 00:27:07.121 "copy": true, 00:27:07.121 "nvme_iov_md": false 00:27:07.121 }, 00:27:07.121 "memory_domains": [ 00:27:07.121 { 00:27:07.121 "dma_device_id": "system", 00:27:07.121 "dma_device_type": 1 00:27:07.121 }, 00:27:07.121 { 00:27:07.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:07.121 "dma_device_type": 2 00:27:07.121 } 00:27:07.121 ], 00:27:07.121 "driver_specific": {} 00:27:07.121 } 00:27:07.121 ] 00:27:07.121 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@907 -- # return 0 00:27:07.122 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:07.122 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:07.122 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:07.122 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:07.122 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:07.122 
07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:07.122 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:07.380 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:07.380 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:07.380 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:07.380 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.380 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:07.380 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:07.380 "name": "Existed_Raid", 00:27:07.380 "uuid": "6bcd2f25-e3c8-4b42-984f-466f021eeb7c", 00:27:07.380 "strip_size_kb": 0, 00:27:07.380 "state": "configuring", 00:27:07.380 "raid_level": "raid1", 00:27:07.380 "superblock": true, 00:27:07.380 "num_base_bdevs": 2, 00:27:07.380 "num_base_bdevs_discovered": 1, 00:27:07.380 "num_base_bdevs_operational": 2, 00:27:07.380 "base_bdevs_list": [ 00:27:07.380 { 00:27:07.380 "name": "BaseBdev1", 00:27:07.380 "uuid": "00e24423-1256-4b03-82d2-a613e51e3a7c", 00:27:07.380 "is_configured": true, 00:27:07.380 "data_offset": 256, 00:27:07.380 "data_size": 7936 00:27:07.380 }, 00:27:07.380 { 00:27:07.380 "name": "BaseBdev2", 00:27:07.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.380 "is_configured": false, 00:27:07.380 "data_offset": 0, 00:27:07.380 "data_size": 0 00:27:07.380 } 00:27:07.380 ] 00:27:07.380 }' 00:27:07.380 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:07.381 07:32:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:07.947 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:08.205 [2024-07-25 07:32:40.610688] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:08.205 [2024-07-25 07:32:40.610723] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x989790 name Existed_Raid, state configuring 00:27:08.205 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:08.464 [2024-07-25 07:32:40.839319] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:08.464 [2024-07-25 07:32:40.840702] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:08.464 [2024-07-25 07:32:40.840738] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.464 07:32:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:08.722 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.722 "name": "Existed_Raid", 00:27:08.722 "uuid": "376486f1-80b6-45a8-97bd-873176f81254", 00:27:08.722 "strip_size_kb": 0, 00:27:08.722 "state": "configuring", 00:27:08.722 "raid_level": "raid1", 00:27:08.722 "superblock": true, 00:27:08.722 "num_base_bdevs": 2, 00:27:08.722 "num_base_bdevs_discovered": 1, 00:27:08.722 "num_base_bdevs_operational": 2, 00:27:08.722 "base_bdevs_list": [ 00:27:08.722 { 00:27:08.722 "name": "BaseBdev1", 00:27:08.722 "uuid": "00e24423-1256-4b03-82d2-a613e51e3a7c", 00:27:08.722 "is_configured": true, 00:27:08.722 "data_offset": 256, 00:27:08.722 "data_size": 7936 00:27:08.723 }, 00:27:08.723 { 00:27:08.723 "name": "BaseBdev2", 00:27:08.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.723 "is_configured": false, 00:27:08.723 "data_offset": 0, 00:27:08.723 "data_size": 0 00:27:08.723 } 00:27:08.723 ] 00:27:08.723 }' 00:27:08.723 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.723 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:09.290 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:27:09.548 [2024-07-25 07:32:41.881290] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:09.548 [2024-07-25 07:32:41.881427] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x98a580 00:27:09.548 [2024-07-25 07:32:41.881440] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:09.548 [2024-07-25 07:32:41.881598] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98bb20 00:27:09.548 [2024-07-25 07:32:41.881717] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x98a580 00:27:09.548 [2024-07-25 07:32:41.881726] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x98a580 00:27:09.548 [2024-07-25 07:32:41.881811] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:09.548 BaseBdev2 00:27:09.548 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:09.548 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:27:09.548 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:27:09.548 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:27:09.548 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:27:09.548 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:27:09.548 07:32:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:09.806 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:10.065 [ 00:27:10.065 { 00:27:10.065 "name": "BaseBdev2", 00:27:10.065 "aliases": [ 00:27:10.065 "eaa90c99-6002-47e2-b93e-babaecf67f23" 00:27:10.065 ], 00:27:10.065 "product_name": "Malloc disk", 00:27:10.065 "block_size": 4096, 00:27:10.065 "num_blocks": 8192, 00:27:10.065 "uuid": "eaa90c99-6002-47e2-b93e-babaecf67f23", 00:27:10.065 "assigned_rate_limits": { 00:27:10.065 "rw_ios_per_sec": 0, 00:27:10.065 "rw_mbytes_per_sec": 0, 00:27:10.065 "r_mbytes_per_sec": 0, 00:27:10.065 "w_mbytes_per_sec": 0 00:27:10.065 }, 00:27:10.065 "claimed": true, 00:27:10.065 "claim_type": "exclusive_write", 00:27:10.065 "zoned": false, 00:27:10.065 "supported_io_types": { 00:27:10.065 "read": true, 00:27:10.065 "write": true, 00:27:10.065 "unmap": true, 00:27:10.065 "flush": true, 00:27:10.065 "reset": true, 00:27:10.065 "nvme_admin": false, 00:27:10.065 "nvme_io": false, 00:27:10.065 "nvme_io_md": false, 00:27:10.065 "write_zeroes": true, 00:27:10.065 "zcopy": true, 00:27:10.065 "get_zone_info": false, 00:27:10.065 "zone_management": false, 00:27:10.065 "zone_append": false, 00:27:10.065 "compare": false, 00:27:10.065 "compare_and_write": false, 00:27:10.065 "abort": true, 00:27:10.065 "seek_hole": false, 00:27:10.065 "seek_data": false, 00:27:10.065 "copy": true, 00:27:10.065 "nvme_iov_md": false 00:27:10.065 }, 00:27:10.065 "memory_domains": [ 00:27:10.065 { 00:27:10.065 "dma_device_id": "system", 00:27:10.065 "dma_device_type": 1 00:27:10.065 }, 00:27:10.065 { 00:27:10.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.065 "dma_device_type": 2 00:27:10.065 } 00:27:10.065 ], 00:27:10.065 "driver_specific": {} 00:27:10.065 } 00:27:10.065 ] 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@907 -- # return 0 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:10.065 07:32:42 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.065 "name": "Existed_Raid", 00:27:10.065 "uuid": "376486f1-80b6-45a8-97bd-873176f81254", 00:27:10.065 "strip_size_kb": 0, 00:27:10.065 "state": "online", 00:27:10.065 "raid_level": "raid1", 00:27:10.065 "superblock": true, 00:27:10.065 "num_base_bdevs": 2, 00:27:10.065 "num_base_bdevs_discovered": 2, 00:27:10.065 "num_base_bdevs_operational": 2, 00:27:10.065 "base_bdevs_list": [ 00:27:10.065 { 00:27:10.065 "name": "BaseBdev1", 00:27:10.065 "uuid": "00e24423-1256-4b03-82d2-a613e51e3a7c", 00:27:10.065 "is_configured": true, 00:27:10.065 "data_offset": 256, 00:27:10.065 "data_size": 7936 00:27:10.065 }, 00:27:10.065 { 00:27:10.065 "name": "BaseBdev2", 00:27:10.065 "uuid": "eaa90c99-6002-47e2-b93e-babaecf67f23", 00:27:10.065 "is_configured": true, 00:27:10.065 "data_offset": 256, 00:27:10.065 "data_size": 7936 00:27:10.065 } 00:27:10.065 ] 00:27:10.065 }' 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.065 07:32:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:10.631 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:10.631 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:10.631 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:10.631 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:10.631 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:10.631 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:10.631 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:10.631 07:32:43 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:10.890 [2024-07-25 07:32:43.357432] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:10.890 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:10.890 "name": "Existed_Raid", 00:27:10.890 "aliases": [ 00:27:10.890 "376486f1-80b6-45a8-97bd-873176f81254" 00:27:10.890 ], 00:27:10.890 "product_name": "Raid Volume", 00:27:10.890 "block_size": 4096, 00:27:10.890 "num_blocks": 7936, 00:27:10.890 "uuid": "376486f1-80b6-45a8-97bd-873176f81254", 00:27:10.890 "assigned_rate_limits": { 00:27:10.890 "rw_ios_per_sec": 0, 00:27:10.890 "rw_mbytes_per_sec": 0, 00:27:10.890 "r_mbytes_per_sec": 0, 00:27:10.890 "w_mbytes_per_sec": 0 00:27:10.890 }, 00:27:10.890 "claimed": false, 00:27:10.890 "zoned": false, 00:27:10.890 "supported_io_types": { 00:27:10.890 "read": true, 00:27:10.890 "write": true, 00:27:10.890 "unmap": false, 00:27:10.890 "flush": false, 00:27:10.890 "reset": true, 00:27:10.890 "nvme_admin": false, 00:27:10.890 "nvme_io": false, 00:27:10.890 "nvme_io_md": false, 00:27:10.890 "write_zeroes": true, 00:27:10.890 "zcopy": false, 00:27:10.890 "get_zone_info": false, 00:27:10.890 "zone_management": false, 00:27:10.890 "zone_append": false, 00:27:10.890 "compare": false, 00:27:10.890 "compare_and_write": false, 00:27:10.890 "abort": false, 00:27:10.890 "seek_hole": false, 00:27:10.890 "seek_data": false, 00:27:10.890 "copy": false, 00:27:10.890 "nvme_iov_md": false 00:27:10.890 }, 00:27:10.890 "memory_domains": [ 00:27:10.890 { 00:27:10.890 "dma_device_id": "system", 00:27:10.890 "dma_device_type": 1 00:27:10.890 }, 00:27:10.890 { 00:27:10.890 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.890 "dma_device_type": 2 00:27:10.890 }, 00:27:10.890 { 00:27:10.890 "dma_device_id": "system", 00:27:10.890 "dma_device_type": 1 00:27:10.890 }, 00:27:10.890 { 00:27:10.890 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.890 "dma_device_type": 2 00:27:10.890 } 00:27:10.890 ], 00:27:10.890 "driver_specific": { 00:27:10.890 "raid": { 00:27:10.890 "uuid": "376486f1-80b6-45a8-97bd-873176f81254", 00:27:10.890 "strip_size_kb": 0, 00:27:10.890 "state": "online", 00:27:10.890 "raid_level": "raid1", 00:27:10.890 "superblock": true, 00:27:10.890 "num_base_bdevs": 2, 00:27:10.890 "num_base_bdevs_discovered": 2, 00:27:10.890 "num_base_bdevs_operational": 2, 00:27:10.890 "base_bdevs_list": [ 00:27:10.890 { 00:27:10.890 "name": "BaseBdev1", 00:27:10.890 "uuid": "00e24423-1256-4b03-82d2-a613e51e3a7c", 00:27:10.890 "is_configured": true, 00:27:10.890 "data_offset": 256, 00:27:10.890 "data_size": 7936 00:27:10.890 }, 00:27:10.890 { 00:27:10.890 "name": "BaseBdev2", 00:27:10.890 "uuid": "eaa90c99-6002-47e2-b93e-babaecf67f23", 00:27:10.890 "is_configured": true, 00:27:10.890 "data_offset": 256, 00:27:10.890 "data_size": 7936 00:27:10.890 } 00:27:10.890 ] 00:27:10.890 } 00:27:10.890 } 00:27:10.890 }' 00:27:10.890 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:10.890 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:10.890 BaseBdev2' 00:27:10.890 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:10.890 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:10.890 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:11.148 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:11.148 "name": "BaseBdev1", 00:27:11.148 "aliases": [ 00:27:11.148 "00e24423-1256-4b03-82d2-a613e51e3a7c" 00:27:11.148 ], 00:27:11.148 "product_name": "Malloc disk", 00:27:11.148 "block_size": 4096, 00:27:11.148 "num_blocks": 8192, 00:27:11.148 "uuid": "00e24423-1256-4b03-82d2-a613e51e3a7c", 00:27:11.148 "assigned_rate_limits": { 00:27:11.148 "rw_ios_per_sec": 0, 00:27:11.148 "rw_mbytes_per_sec": 0, 00:27:11.148 "r_mbytes_per_sec": 0, 00:27:11.148 "w_mbytes_per_sec": 0 00:27:11.148 }, 00:27:11.148 "claimed": true, 00:27:11.148 "claim_type": "exclusive_write", 00:27:11.148 "zoned": false, 00:27:11.148 "supported_io_types": { 00:27:11.148 "read": true, 00:27:11.148 "write": true, 00:27:11.148 "unmap": true, 00:27:11.148 "flush": true, 00:27:11.148 "reset": true, 00:27:11.148 "nvme_admin": false, 00:27:11.148 "nvme_io": false, 00:27:11.148 "nvme_io_md": false, 00:27:11.148 "write_zeroes": true, 00:27:11.148 "zcopy": true, 00:27:11.148 "get_zone_info": false, 00:27:11.148 "zone_management": false, 00:27:11.148 "zone_append": false, 00:27:11.148 "compare": false, 00:27:11.148 "compare_and_write": false, 00:27:11.148 "abort": true, 00:27:11.148 "seek_hole": false, 00:27:11.148 "seek_data": false, 00:27:11.148 "copy": true, 00:27:11.148 "nvme_iov_md": false 00:27:11.148 }, 00:27:11.148 "memory_domains": [ 00:27:11.148 { 00:27:11.148 "dma_device_id": "system", 00:27:11.148 "dma_device_type": 1 00:27:11.148 }, 00:27:11.148 { 00:27:11.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.148 "dma_device_type": 2 00:27:11.148 } 00:27:11.148 ], 00:27:11.148 "driver_specific": {} 00:27:11.148 }' 00:27:11.148 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:11.406 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:11.666 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:11.666 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:11.666 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:11.666 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:11.666 07:32:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:11.925 "name": "BaseBdev2", 00:27:11.925 "aliases": [ 00:27:11.925 "eaa90c99-6002-47e2-b93e-babaecf67f23" 00:27:11.925 ], 00:27:11.925 "product_name": "Malloc disk", 00:27:11.925 "block_size": 4096, 00:27:11.925 "num_blocks": 8192, 00:27:11.925 "uuid": "eaa90c99-6002-47e2-b93e-babaecf67f23", 00:27:11.925 "assigned_rate_limits": { 00:27:11.925 "rw_ios_per_sec": 0, 00:27:11.925 "rw_mbytes_per_sec": 0, 00:27:11.925 "r_mbytes_per_sec": 0, 00:27:11.925 "w_mbytes_per_sec": 0 00:27:11.925 }, 00:27:11.925 "claimed": true, 00:27:11.925 "claim_type": "exclusive_write", 00:27:11.925 "zoned": false, 00:27:11.925 "supported_io_types": { 00:27:11.925 "read": true, 00:27:11.925 "write": true, 00:27:11.925 "unmap": true, 00:27:11.925 "flush": true, 00:27:11.925 "reset": true, 00:27:11.925 "nvme_admin": false, 00:27:11.925 "nvme_io": false, 00:27:11.925 "nvme_io_md": false, 00:27:11.925 "write_zeroes": true, 00:27:11.925 "zcopy": true, 00:27:11.925 "get_zone_info": false, 00:27:11.925 "zone_management": false, 00:27:11.925 "zone_append": false, 00:27:11.925 "compare": false, 00:27:11.925 "compare_and_write": false, 00:27:11.925 "abort": true, 00:27:11.925 "seek_hole": false, 00:27:11.925 "seek_data": false, 00:27:11.925 "copy": true, 00:27:11.925 "nvme_iov_md": false 00:27:11.925 }, 00:27:11.925 "memory_domains": [ 00:27:11.925 { 00:27:11.925 "dma_device_id": "system", 00:27:11.925 "dma_device_type": 1 00:27:11.925 }, 00:27:11.925 { 00:27:11.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.925 "dma_device_type": 2 00:27:11.925 } 00:27:11.925 ], 00:27:11.925 "driver_specific": {} 00:27:11.925 }' 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.925 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.183 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:12.183 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.183 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.183 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:12.183 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:12.442 [2024-07-25 07:32:44.757094] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:12.442 07:32:44 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.442 07:32:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:12.701 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.701 "name": "Existed_Raid", 00:27:12.701 "uuid": "376486f1-80b6-45a8-97bd-873176f81254", 00:27:12.701 "strip_size_kb": 0, 00:27:12.701 "state": "online", 00:27:12.701 "raid_level": "raid1", 00:27:12.701 "superblock": true, 00:27:12.701 "num_base_bdevs": 2, 00:27:12.701 "num_base_bdevs_discovered": 1, 00:27:12.701 "num_base_bdevs_operational": 1, 00:27:12.701 "base_bdevs_list": [ 00:27:12.701 { 00:27:12.701 "name": null, 00:27:12.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.701 "is_configured": false, 00:27:12.701 "data_offset": 256, 00:27:12.701 "data_size": 7936 00:27:12.701 }, 00:27:12.701 { 00:27:12.701 "name": "BaseBdev2", 00:27:12.701 "uuid": "eaa90c99-6002-47e2-b93e-babaecf67f23", 00:27:12.701 "is_configured": true, 00:27:12.701 "data_offset": 256, 00:27:12.701 "data_size": 7936 00:27:12.701 } 00:27:12.701 ] 00:27:12.701 }' 00:27:12.701 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.701 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:13.268 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:13.268 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:13.268 07:32:45 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.268 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:13.526 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:13.526 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:13.526 07:32:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:13.526 [2024-07-25 07:32:46.029526] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:13.526 [2024-07-25 07:32:46.029602] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:13.526 [2024-07-25 07:32:46.039757] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:13.526 [2024-07-25 07:32:46.039787] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:13.526 [2024-07-25 07:32:46.039798] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x98a580 name Existed_Raid, state offline 00:27:13.526 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:13.526 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1752689 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 1752689 ']' 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1752689 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:13.785 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1752689 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1752689' 00:27:14.047 killing process with pid 1752689 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@969 -- # kill 
1752689 00:27:14.047 [2024-07-25 07:32:46.346364] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@974 -- # wait 1752689 00:27:14.047 [2024-07-25 07:32:46.347223] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:14.047 00:27:14.047 real 0m10.045s 00:27:14.047 user 0m17.824s 00:27:14.047 sys 0m1.885s 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:14.047 07:32:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:14.047 ************************************ 00:27:14.047 END TEST raid_state_function_test_sb_4k 00:27:14.047 ************************************ 00:27:14.306 07:32:46 bdev_raid -- bdev/bdev_raid.sh@979 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:14.306 07:32:46 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:27:14.306 07:32:46 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:14.306 07:32:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:14.306 ************************************ 00:27:14.306 START TEST raid_superblock_test_4k 00:27:14.306 ************************************ 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@414 -- # local strip_size 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@427 -- # raid_pid=1754679 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@428 
-- # waitforlisten 1754679 /var/tmp/spdk-raid.sock 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # '[' -z 1754679 ']' 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:14.306 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:14.306 07:32:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:14.306 [2024-07-25 07:32:46.675637] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:27:14.306 [2024-07-25 07:32:46.675690] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1754679 ] 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.0 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.1 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.2 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.3 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.4 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.5 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.6 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:01.7 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:02.0 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:02.1 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:02.2 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:02.3 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:02.4 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:02.5 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3d:02.6 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: 
Requested device 0000:3d:02.7 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.0 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.1 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.2 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.3 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.4 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.5 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.6 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:01.7 cannot be used 00:27:14.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.306 EAL: Requested device 0000:3f:02.0 cannot be used 00:27:14.307 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.307 EAL: Requested device 0000:3f:02.1 cannot be used 00:27:14.307 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.307 EAL: Requested device 0000:3f:02.2 cannot be used 00:27:14.307 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.307 EAL: Requested device 0000:3f:02.3 cannot be used 00:27:14.307 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.307 EAL: Requested device 0000:3f:02.4 cannot be used 00:27:14.307 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.307 EAL: Requested device 0000:3f:02.5 cannot be used 00:27:14.307 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.307 EAL: Requested device 0000:3f:02.6 cannot be used 00:27:14.307 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:14.307 EAL: Requested device 0000:3f:02.7 cannot be used 00:27:14.307 [2024-07-25 07:32:46.809832] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:14.565 [2024-07-25 07:32:46.896986] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:27:14.565 [2024-07-25 07:32:46.950864] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:14.565 [2024-07-25 07:32:46.950891] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@864 -- # return 0 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:15.132 07:32:47 
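Before any of the raid RPCs above can run, the harness launches a stand-alone bdev_svc app with raid debug logging and blocks until its RPC socket answers (the waitforlisten step in the trace); a rough equivalent, with a simple polling loop standing in for that helper (the rpc_get_methods probe is an assumption, not part of the trace):

# Start the bdev service used by the raid tests and wait for its RPC socket.
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
    -r /var/tmp/spdk-raid.sock -L bdev_raid &
raid_pid=$!
# Poll the socket until the app responds (stand-in for the waitforlisten helper).
until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
    sleep 0.2
done
echo "bdev_svc (pid $raid_pid) is listening on /var/tmp/spdk-raid.sock"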
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:15.132 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:15.391 malloc1 00:27:15.391 07:32:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:15.649 [2024-07-25 07:32:48.022407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:15.649 [2024-07-25 07:32:48.022451] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:15.649 [2024-07-25 07:32:48.022470] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1608280 00:27:15.649 [2024-07-25 07:32:48.022482] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:15.649 [2024-07-25 07:32:48.024002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:15.649 [2024-07-25 07:32:48.024030] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:15.649 pt1 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:15.649 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:15.907 malloc2 00:27:15.907 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:16.166 [2024-07-25 07:32:48.484127] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:16.166 [2024-07-25 07:32:48.484172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:16.166 [2024-07-25 07:32:48.484189] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b38c0 00:27:16.166 [2024-07-25 07:32:48.484200] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:16.166 [2024-07-25 07:32:48.485534] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:27:16.166 [2024-07-25 07:32:48.485561] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:16.166 pt2 00:27:16.166 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:27:16.166 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:16.166 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:16.166 [2024-07-25 07:32:48.696696] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:16.166 [2024-07-25 07:32:48.697808] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:16.166 [2024-07-25 07:32:48.697951] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17b1720 00:27:16.166 [2024-07-25 07:32:48.697963] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:16.166 [2024-07-25 07:32:48.698134] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16096e0 00:27:16.166 [2024-07-25 07:32:48.698272] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17b1720 00:27:16.166 [2024-07-25 07:32:48.698282] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17b1720 00:27:16.166 [2024-07-25 07:32:48.698369] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.425 "name": "raid_bdev1", 00:27:16.425 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:16.425 "strip_size_kb": 0, 00:27:16.425 "state": "online", 00:27:16.425 "raid_level": "raid1", 00:27:16.425 "superblock": true, 00:27:16.425 "num_base_bdevs": 2, 00:27:16.425 "num_base_bdevs_discovered": 2, 00:27:16.425 "num_base_bdevs_operational": 2, 00:27:16.425 "base_bdevs_list": [ 00:27:16.425 
{ 00:27:16.425 "name": "pt1", 00:27:16.425 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:16.425 "is_configured": true, 00:27:16.425 "data_offset": 256, 00:27:16.425 "data_size": 7936 00:27:16.425 }, 00:27:16.425 { 00:27:16.425 "name": "pt2", 00:27:16.425 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:16.425 "is_configured": true, 00:27:16.425 "data_offset": 256, 00:27:16.425 "data_size": 7936 00:27:16.425 } 00:27:16.425 ] 00:27:16.425 }' 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.425 07:32:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:16.991 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:17.250 [2024-07-25 07:32:49.715571] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:17.250 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:17.250 "name": "raid_bdev1", 00:27:17.250 "aliases": [ 00:27:17.250 "1c81c93a-38c3-4581-9856-6b51a0faad22" 00:27:17.250 ], 00:27:17.250 "product_name": "Raid Volume", 00:27:17.250 "block_size": 4096, 00:27:17.250 "num_blocks": 7936, 00:27:17.250 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:17.250 "assigned_rate_limits": { 00:27:17.250 "rw_ios_per_sec": 0, 00:27:17.250 "rw_mbytes_per_sec": 0, 00:27:17.250 "r_mbytes_per_sec": 0, 00:27:17.250 "w_mbytes_per_sec": 0 00:27:17.250 }, 00:27:17.250 "claimed": false, 00:27:17.250 "zoned": false, 00:27:17.250 "supported_io_types": { 00:27:17.250 "read": true, 00:27:17.250 "write": true, 00:27:17.250 "unmap": false, 00:27:17.250 "flush": false, 00:27:17.250 "reset": true, 00:27:17.250 "nvme_admin": false, 00:27:17.250 "nvme_io": false, 00:27:17.250 "nvme_io_md": false, 00:27:17.250 "write_zeroes": true, 00:27:17.250 "zcopy": false, 00:27:17.250 "get_zone_info": false, 00:27:17.250 "zone_management": false, 00:27:17.250 "zone_append": false, 00:27:17.250 "compare": false, 00:27:17.250 "compare_and_write": false, 00:27:17.250 "abort": false, 00:27:17.250 "seek_hole": false, 00:27:17.250 "seek_data": false, 00:27:17.250 "copy": false, 00:27:17.250 "nvme_iov_md": false 00:27:17.250 }, 00:27:17.250 "memory_domains": [ 00:27:17.250 { 00:27:17.250 "dma_device_id": "system", 00:27:17.250 "dma_device_type": 1 00:27:17.250 }, 00:27:17.250 { 00:27:17.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:17.250 "dma_device_type": 2 00:27:17.250 }, 00:27:17.250 { 00:27:17.250 "dma_device_id": "system", 00:27:17.250 "dma_device_type": 1 00:27:17.250 }, 00:27:17.250 { 00:27:17.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:17.250 
"dma_device_type": 2 00:27:17.250 } 00:27:17.250 ], 00:27:17.250 "driver_specific": { 00:27:17.250 "raid": { 00:27:17.250 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:17.250 "strip_size_kb": 0, 00:27:17.250 "state": "online", 00:27:17.250 "raid_level": "raid1", 00:27:17.250 "superblock": true, 00:27:17.250 "num_base_bdevs": 2, 00:27:17.250 "num_base_bdevs_discovered": 2, 00:27:17.250 "num_base_bdevs_operational": 2, 00:27:17.250 "base_bdevs_list": [ 00:27:17.250 { 00:27:17.250 "name": "pt1", 00:27:17.250 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:17.250 "is_configured": true, 00:27:17.250 "data_offset": 256, 00:27:17.250 "data_size": 7936 00:27:17.250 }, 00:27:17.250 { 00:27:17.250 "name": "pt2", 00:27:17.250 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:17.250 "is_configured": true, 00:27:17.250 "data_offset": 256, 00:27:17.250 "data_size": 7936 00:27:17.250 } 00:27:17.250 ] 00:27:17.250 } 00:27:17.250 } 00:27:17.250 }' 00:27:17.250 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:17.250 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:17.250 pt2' 00:27:17.508 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:17.508 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:17.508 07:32:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:17.508 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:17.508 "name": "pt1", 00:27:17.508 "aliases": [ 00:27:17.508 "00000000-0000-0000-0000-000000000001" 00:27:17.508 ], 00:27:17.508 "product_name": "passthru", 00:27:17.508 "block_size": 4096, 00:27:17.508 "num_blocks": 8192, 00:27:17.508 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:17.508 "assigned_rate_limits": { 00:27:17.508 "rw_ios_per_sec": 0, 00:27:17.508 "rw_mbytes_per_sec": 0, 00:27:17.508 "r_mbytes_per_sec": 0, 00:27:17.508 "w_mbytes_per_sec": 0 00:27:17.508 }, 00:27:17.508 "claimed": true, 00:27:17.508 "claim_type": "exclusive_write", 00:27:17.508 "zoned": false, 00:27:17.508 "supported_io_types": { 00:27:17.508 "read": true, 00:27:17.508 "write": true, 00:27:17.508 "unmap": true, 00:27:17.508 "flush": true, 00:27:17.508 "reset": true, 00:27:17.508 "nvme_admin": false, 00:27:17.508 "nvme_io": false, 00:27:17.508 "nvme_io_md": false, 00:27:17.508 "write_zeroes": true, 00:27:17.508 "zcopy": true, 00:27:17.508 "get_zone_info": false, 00:27:17.508 "zone_management": false, 00:27:17.508 "zone_append": false, 00:27:17.508 "compare": false, 00:27:17.508 "compare_and_write": false, 00:27:17.508 "abort": true, 00:27:17.508 "seek_hole": false, 00:27:17.508 "seek_data": false, 00:27:17.508 "copy": true, 00:27:17.508 "nvme_iov_md": false 00:27:17.508 }, 00:27:17.508 "memory_domains": [ 00:27:17.508 { 00:27:17.508 "dma_device_id": "system", 00:27:17.508 "dma_device_type": 1 00:27:17.508 }, 00:27:17.508 { 00:27:17.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:17.508 "dma_device_type": 2 00:27:17.508 } 00:27:17.508 ], 00:27:17.508 "driver_specific": { 00:27:17.508 "passthru": { 00:27:17.508 "name": "pt1", 00:27:17.508 "base_bdev_name": "malloc1" 00:27:17.508 } 00:27:17.508 } 00:27:17.508 }' 00:27:17.508 07:32:50 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:17.767 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.025 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.026 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:18.026 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:18.026 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:18.026 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:18.283 "name": "pt2", 00:27:18.283 "aliases": [ 00:27:18.283 "00000000-0000-0000-0000-000000000002" 00:27:18.283 ], 00:27:18.283 "product_name": "passthru", 00:27:18.283 "block_size": 4096, 00:27:18.283 "num_blocks": 8192, 00:27:18.283 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:18.283 "assigned_rate_limits": { 00:27:18.283 "rw_ios_per_sec": 0, 00:27:18.283 "rw_mbytes_per_sec": 0, 00:27:18.283 "r_mbytes_per_sec": 0, 00:27:18.283 "w_mbytes_per_sec": 0 00:27:18.283 }, 00:27:18.283 "claimed": true, 00:27:18.283 "claim_type": "exclusive_write", 00:27:18.283 "zoned": false, 00:27:18.283 "supported_io_types": { 00:27:18.283 "read": true, 00:27:18.283 "write": true, 00:27:18.283 "unmap": true, 00:27:18.283 "flush": true, 00:27:18.283 "reset": true, 00:27:18.283 "nvme_admin": false, 00:27:18.283 "nvme_io": false, 00:27:18.283 "nvme_io_md": false, 00:27:18.283 "write_zeroes": true, 00:27:18.283 "zcopy": true, 00:27:18.283 "get_zone_info": false, 00:27:18.283 "zone_management": false, 00:27:18.283 "zone_append": false, 00:27:18.283 "compare": false, 00:27:18.283 "compare_and_write": false, 00:27:18.283 "abort": true, 00:27:18.283 "seek_hole": false, 00:27:18.283 "seek_data": false, 00:27:18.283 "copy": true, 00:27:18.283 "nvme_iov_md": false 00:27:18.283 }, 00:27:18.283 "memory_domains": [ 00:27:18.283 { 00:27:18.283 "dma_device_id": "system", 00:27:18.283 "dma_device_type": 1 00:27:18.283 }, 00:27:18.283 { 00:27:18.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:18.283 "dma_device_type": 2 00:27:18.283 } 00:27:18.283 ], 00:27:18.283 "driver_specific": { 00:27:18.283 "passthru": { 00:27:18.283 "name": "pt2", 00:27:18.283 "base_bdev_name": "malloc2" 00:27:18.283 } 00:27:18.283 } 00:27:18.283 }' 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 
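The volume being inspected here was assembled from two 32 MB malloc bdevs (8192 blocks of 4096 bytes, matching the dumps above), each wrapped in a passthru bdev, then combined with bdev_raid_create -s so a superblock is written to every member. The RPC sequence from the trace, condensed into a sketch:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Base devices: two malloc bdevs, each fronted by a passthru bdev with a fixed UUID.
$rpc bdev_malloc_create 32 4096 -b malloc1
$rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc bdev_malloc_create 32 4096 -b malloc2
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
# Assemble a raid1 volume over the passthru bdevs; -s writes a superblock to each member.
$rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
# The new volume should report state "online" with both base bdevs discovered.
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'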
00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.283 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.541 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:18.541 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.541 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.541 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:18.541 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:27:18.541 07:32:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:18.799 [2024-07-25 07:32:51.103222] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:18.799 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=1c81c93a-38c3-4581-9856-6b51a0faad22 00:27:18.799 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # '[' -z 1c81c93a-38c3-4581-9856-6b51a0faad22 ']' 00:27:18.799 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:18.799 [2024-07-25 07:32:51.331596] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:18.799 [2024-07-25 07:32:51.331613] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:18.799 [2024-07-25 07:32:51.331660] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:18.799 [2024-07-25 07:32:51.331711] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:18.799 [2024-07-25 07:32:51.331722] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b1720 name raid_bdev1, state offline 00:27:19.057 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.057 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:27:19.057 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:27:19.057 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:27:19.057 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:27:19.057 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:19.315 07:32:51 
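Each base bdev dump is run through the same four jq probes seen above (block_size 4096, md_size/md_interleave/dif_type all null for the 4k flavour); a compact version of that property check, with check_bdev_props being a hypothetical helper name introduced only for this sketch:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
check_bdev_props() {               # usage: check_bdev_props <bdev name>
    local info
    info=$($rpc bdev_get_bdevs -b "$1" | jq '.[]')
    # 4k test flavour: 4096-byte blocks, no separate metadata, no interleave, no DIF.
    [[ $(jq .block_size    <<< "$info") == 4096 ]] || return 1
    [[ $(jq .md_size       <<< "$info") == null ]] || return 1
    [[ $(jq .md_interleave <<< "$info") == null ]] || return 1
    [[ $(jq .dif_type      <<< "$info") == null ]] || return 1
}
check_bdev_props pt1 && check_bdev_props pt2 && echo "base bdev properties look right"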
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:27:19.315 07:32:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:19.573 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:19.573 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # local es=0 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:19.831 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:20.089 [2024-07-25 07:32:52.490599] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:20.089 [2024-07-25 07:32:52.491838] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:20.089 [2024-07-25 07:32:52.491887] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:20.089 [2024-07-25 07:32:52.491924] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:20.089 [2024-07-25 07:32:52.491942] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:20.089 [2024-07-25 07:32:52.491950] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b2d40 name raid_bdev1, state configuring 00:27:20.089 request: 00:27:20.089 { 00:27:20.089 "name": "raid_bdev1", 00:27:20.089 "raid_level": "raid1", 00:27:20.089 "base_bdevs": [ 00:27:20.089 "malloc1", 00:27:20.089 "malloc2" 00:27:20.089 ], 00:27:20.089 "superblock": false, 00:27:20.089 "method": "bdev_raid_create", 00:27:20.089 "req_id": 1 00:27:20.089 } 00:27:20.089 Got JSON-RPC error response 00:27:20.089 response: 00:27:20.089 { 00:27:20.089 "code": -17, 00:27:20.089 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:20.090 } 00:27:20.090 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # es=1 00:27:20.090 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:20.090 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:20.090 07:32:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:20.090 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.090 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:27:20.348 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:27:20.348 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:27:20.348 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:20.606 [2024-07-25 07:32:52.947751] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:20.606 [2024-07-25 07:32:52.947792] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.606 [2024-07-25 07:32:52.947808] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b3100 00:27:20.606 [2024-07-25 07:32:52.947820] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.606 [2024-07-25 07:32:52.949277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.606 [2024-07-25 07:32:52.949304] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:20.606 [2024-07-25 07:32:52.949362] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:20.606 [2024-07-25 07:32:52.949386] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:20.606 pt1 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.606 07:32:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.864 07:32:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:20.864 "name": "raid_bdev1", 00:27:20.864 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:20.864 "strip_size_kb": 0, 00:27:20.864 "state": "configuring", 00:27:20.864 "raid_level": "raid1", 00:27:20.864 "superblock": true, 00:27:20.864 "num_base_bdevs": 2, 00:27:20.864 "num_base_bdevs_discovered": 1, 00:27:20.864 "num_base_bdevs_operational": 2, 00:27:20.864 "base_bdevs_list": [ 00:27:20.864 { 00:27:20.864 "name": "pt1", 00:27:20.864 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:20.864 "is_configured": true, 00:27:20.864 "data_offset": 256, 00:27:20.864 "data_size": 7936 00:27:20.864 }, 00:27:20.864 { 00:27:20.864 "name": null, 00:27:20.864 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:20.864 "is_configured": false, 00:27:20.864 "data_offset": 256, 00:27:20.864 "data_size": 7936 00:27:20.864 } 00:27:20.864 ] 00:27:20.864 }' 00:27:20.864 07:32:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:20.864 07:32:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:21.431 07:32:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:27:21.431 07:32:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:27:21.431 07:32:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:27:21.431 07:32:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:21.689 [2024-07-25 07:32:53.982483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:21.689 [2024-07-25 07:32:53.982527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.689 [2024-07-25 07:32:53.982544] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16084b0 00:27:21.689 [2024-07-25 07:32:53.982556] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.689 [2024-07-25 07:32:53.982849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.689 [2024-07-25 07:32:53.982865] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:21.689 [2024-07-25 07:32:53.982916] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:21.689 [2024-07-25 07:32:53.982934] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:21.689 [2024-07-25 07:32:53.983022] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1606bc0 00:27:21.689 [2024-07-25 
07:32:53.983031] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:21.689 [2024-07-25 07:32:53.983200] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1608ed0 00:27:21.689 [2024-07-25 07:32:53.983319] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1606bc0 00:27:21.689 [2024-07-25 07:32:53.983329] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1606bc0 00:27:21.689 [2024-07-25 07:32:53.983418] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:21.689 pt2 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.689 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.947 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.947 "name": "raid_bdev1", 00:27:21.947 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:21.947 "strip_size_kb": 0, 00:27:21.947 "state": "online", 00:27:21.947 "raid_level": "raid1", 00:27:21.947 "superblock": true, 00:27:21.947 "num_base_bdevs": 2, 00:27:21.947 "num_base_bdevs_discovered": 2, 00:27:21.947 "num_base_bdevs_operational": 2, 00:27:21.947 "base_bdevs_list": [ 00:27:21.947 { 00:27:21.947 "name": "pt1", 00:27:21.947 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:21.947 "is_configured": true, 00:27:21.947 "data_offset": 256, 00:27:21.947 "data_size": 7936 00:27:21.947 }, 00:27:21.947 { 00:27:21.947 "name": "pt2", 00:27:21.947 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:21.947 "is_configured": true, 00:27:21.947 "data_offset": 256, 00:27:21.947 "data_size": 7936 00:27:21.947 } 00:27:21.947 ] 00:27:21.947 }' 00:27:21.947 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.947 07:32:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # 
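The steps the trace just walked through exercise the superblock path: with raid_bdev1 deleted and both passthru bdevs torn down, creating a raid directly on malloc1/malloc2 is rejected (JSON-RPC error -17, "File exists") because each malloc bdev still carries a superblock, and simply re-creating the passthru bdevs lets examine re-assemble raid_bdev1 on its own. A minimal sketch of that round-trip, using the same RPCs, names and UUIDs as the trace:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Tear down the volume and its passthru members; the superblocks stay on malloc1/malloc2.
$rpc bdev_raid_delete raid_bdev1
$rpc bdev_passthru_delete pt1
$rpc bdev_passthru_delete pt2
# Building a new raid directly on the malloc bdevs must fail: their superblocks
# reference a different raid bdev (JSON-RPC error -17, "File exists").
if $rpc bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1; then
    echo "ERROR: create on superblock-carrying bdevs unexpectedly succeeded" >&2
fi
# Re-creating the passthru bdevs is enough: examine finds the superblocks and
# re-assembles raid_bdev1, first as "configuring" with pt1, then "online" with pt2.
$rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'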
verify_raid_bdev_properties raid_bdev1 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:22.513 07:32:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:22.513 [2024-07-25 07:32:55.029489] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:22.772 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:22.772 "name": "raid_bdev1", 00:27:22.772 "aliases": [ 00:27:22.772 "1c81c93a-38c3-4581-9856-6b51a0faad22" 00:27:22.772 ], 00:27:22.772 "product_name": "Raid Volume", 00:27:22.772 "block_size": 4096, 00:27:22.772 "num_blocks": 7936, 00:27:22.772 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:22.772 "assigned_rate_limits": { 00:27:22.772 "rw_ios_per_sec": 0, 00:27:22.772 "rw_mbytes_per_sec": 0, 00:27:22.772 "r_mbytes_per_sec": 0, 00:27:22.772 "w_mbytes_per_sec": 0 00:27:22.772 }, 00:27:22.772 "claimed": false, 00:27:22.772 "zoned": false, 00:27:22.772 "supported_io_types": { 00:27:22.772 "read": true, 00:27:22.772 "write": true, 00:27:22.772 "unmap": false, 00:27:22.772 "flush": false, 00:27:22.772 "reset": true, 00:27:22.772 "nvme_admin": false, 00:27:22.772 "nvme_io": false, 00:27:22.772 "nvme_io_md": false, 00:27:22.772 "write_zeroes": true, 00:27:22.772 "zcopy": false, 00:27:22.772 "get_zone_info": false, 00:27:22.772 "zone_management": false, 00:27:22.772 "zone_append": false, 00:27:22.772 "compare": false, 00:27:22.772 "compare_and_write": false, 00:27:22.772 "abort": false, 00:27:22.772 "seek_hole": false, 00:27:22.772 "seek_data": false, 00:27:22.772 "copy": false, 00:27:22.772 "nvme_iov_md": false 00:27:22.772 }, 00:27:22.772 "memory_domains": [ 00:27:22.772 { 00:27:22.772 "dma_device_id": "system", 00:27:22.772 "dma_device_type": 1 00:27:22.772 }, 00:27:22.772 { 00:27:22.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.772 "dma_device_type": 2 00:27:22.772 }, 00:27:22.772 { 00:27:22.772 "dma_device_id": "system", 00:27:22.772 "dma_device_type": 1 00:27:22.772 }, 00:27:22.772 { 00:27:22.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.772 "dma_device_type": 2 00:27:22.772 } 00:27:22.772 ], 00:27:22.772 "driver_specific": { 00:27:22.772 "raid": { 00:27:22.772 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:22.772 "strip_size_kb": 0, 00:27:22.772 "state": "online", 00:27:22.772 "raid_level": "raid1", 00:27:22.772 "superblock": true, 00:27:22.772 "num_base_bdevs": 2, 00:27:22.772 "num_base_bdevs_discovered": 2, 00:27:22.772 "num_base_bdevs_operational": 2, 00:27:22.772 "base_bdevs_list": [ 00:27:22.772 { 00:27:22.772 "name": "pt1", 00:27:22.772 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:22.772 "is_configured": true, 00:27:22.772 "data_offset": 256, 00:27:22.772 "data_size": 7936 00:27:22.772 }, 00:27:22.772 { 00:27:22.772 "name": "pt2", 00:27:22.772 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:27:22.772 "is_configured": true, 00:27:22.772 "data_offset": 256, 00:27:22.772 "data_size": 7936 00:27:22.772 } 00:27:22.772 ] 00:27:22.772 } 00:27:22.772 } 00:27:22.772 }' 00:27:22.772 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:22.772 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:22.772 pt2' 00:27:22.772 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:22.772 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:22.772 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:23.036 "name": "pt1", 00:27:23.036 "aliases": [ 00:27:23.036 "00000000-0000-0000-0000-000000000001" 00:27:23.036 ], 00:27:23.036 "product_name": "passthru", 00:27:23.036 "block_size": 4096, 00:27:23.036 "num_blocks": 8192, 00:27:23.036 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:23.036 "assigned_rate_limits": { 00:27:23.036 "rw_ios_per_sec": 0, 00:27:23.036 "rw_mbytes_per_sec": 0, 00:27:23.036 "r_mbytes_per_sec": 0, 00:27:23.036 "w_mbytes_per_sec": 0 00:27:23.036 }, 00:27:23.036 "claimed": true, 00:27:23.036 "claim_type": "exclusive_write", 00:27:23.036 "zoned": false, 00:27:23.036 "supported_io_types": { 00:27:23.036 "read": true, 00:27:23.036 "write": true, 00:27:23.036 "unmap": true, 00:27:23.036 "flush": true, 00:27:23.036 "reset": true, 00:27:23.036 "nvme_admin": false, 00:27:23.036 "nvme_io": false, 00:27:23.036 "nvme_io_md": false, 00:27:23.036 "write_zeroes": true, 00:27:23.036 "zcopy": true, 00:27:23.036 "get_zone_info": false, 00:27:23.036 "zone_management": false, 00:27:23.036 "zone_append": false, 00:27:23.036 "compare": false, 00:27:23.036 "compare_and_write": false, 00:27:23.036 "abort": true, 00:27:23.036 "seek_hole": false, 00:27:23.036 "seek_data": false, 00:27:23.036 "copy": true, 00:27:23.036 "nvme_iov_md": false 00:27:23.036 }, 00:27:23.036 "memory_domains": [ 00:27:23.036 { 00:27:23.036 "dma_device_id": "system", 00:27:23.036 "dma_device_type": 1 00:27:23.036 }, 00:27:23.036 { 00:27:23.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.036 "dma_device_type": 2 00:27:23.036 } 00:27:23.036 ], 00:27:23.036 "driver_specific": { 00:27:23.036 "passthru": { 00:27:23.036 "name": "pt1", 00:27:23.036 "base_bdev_name": "malloc1" 00:27:23.036 } 00:27:23.036 } 00:27:23.036 }' 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.036 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- 
# jq .md_interleave 00:27:23.303 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:23.303 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.303 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.303 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:23.303 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:23.303 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:23.303 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:23.562 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:23.562 "name": "pt2", 00:27:23.562 "aliases": [ 00:27:23.562 "00000000-0000-0000-0000-000000000002" 00:27:23.562 ], 00:27:23.562 "product_name": "passthru", 00:27:23.562 "block_size": 4096, 00:27:23.562 "num_blocks": 8192, 00:27:23.562 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:23.562 "assigned_rate_limits": { 00:27:23.562 "rw_ios_per_sec": 0, 00:27:23.562 "rw_mbytes_per_sec": 0, 00:27:23.562 "r_mbytes_per_sec": 0, 00:27:23.562 "w_mbytes_per_sec": 0 00:27:23.562 }, 00:27:23.562 "claimed": true, 00:27:23.562 "claim_type": "exclusive_write", 00:27:23.562 "zoned": false, 00:27:23.562 "supported_io_types": { 00:27:23.562 "read": true, 00:27:23.562 "write": true, 00:27:23.562 "unmap": true, 00:27:23.562 "flush": true, 00:27:23.562 "reset": true, 00:27:23.562 "nvme_admin": false, 00:27:23.562 "nvme_io": false, 00:27:23.562 "nvme_io_md": false, 00:27:23.562 "write_zeroes": true, 00:27:23.562 "zcopy": true, 00:27:23.562 "get_zone_info": false, 00:27:23.562 "zone_management": false, 00:27:23.562 "zone_append": false, 00:27:23.562 "compare": false, 00:27:23.562 "compare_and_write": false, 00:27:23.562 "abort": true, 00:27:23.562 "seek_hole": false, 00:27:23.562 "seek_data": false, 00:27:23.562 "copy": true, 00:27:23.562 "nvme_iov_md": false 00:27:23.562 }, 00:27:23.562 "memory_domains": [ 00:27:23.562 { 00:27:23.562 "dma_device_id": "system", 00:27:23.562 "dma_device_type": 1 00:27:23.562 }, 00:27:23.562 { 00:27:23.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.562 "dma_device_type": 2 00:27:23.562 } 00:27:23.562 ], 00:27:23.562 "driver_specific": { 00:27:23.562 "passthru": { 00:27:23.562 "name": "pt2", 00:27:23.562 "base_bdev_name": "malloc2" 00:27:23.562 } 00:27:23.562 } 00:27:23.562 }' 00:27:23.562 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.562 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.562 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:23.562 07:32:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.562 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.562 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:23.562 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.820 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.820 07:32:56 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:23.820 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.820 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.820 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:23.820 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:23.820 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:27:24.079 [2024-07-25 07:32:56.433182] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:24.079 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # '[' 1c81c93a-38c3-4581-9856-6b51a0faad22 '!=' 1c81c93a-38c3-4581-9856-6b51a0faad22 ']' 00:27:24.079 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:27:24.079 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:24.079 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:24.079 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:24.338 [2024-07-25 07:32:56.661578] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.338 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.596 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:24.596 "name": "raid_bdev1", 00:27:24.596 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:24.596 "strip_size_kb": 0, 00:27:24.596 "state": "online", 00:27:24.596 "raid_level": "raid1", 00:27:24.596 "superblock": true, 00:27:24.596 "num_base_bdevs": 2, 00:27:24.596 "num_base_bdevs_discovered": 1, 00:27:24.596 "num_base_bdevs_operational": 1, 00:27:24.596 "base_bdevs_list": [ 00:27:24.596 { 00:27:24.596 "name": null, 
00:27:24.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.596 "is_configured": false, 00:27:24.596 "data_offset": 256, 00:27:24.596 "data_size": 7936 00:27:24.596 }, 00:27:24.596 { 00:27:24.596 "name": "pt2", 00:27:24.596 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:24.596 "is_configured": true, 00:27:24.596 "data_offset": 256, 00:27:24.596 "data_size": 7936 00:27:24.596 } 00:27:24.596 ] 00:27:24.596 }' 00:27:24.596 07:32:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:24.596 07:32:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:25.164 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:25.164 [2024-07-25 07:32:57.692279] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:25.164 [2024-07-25 07:32:57.692304] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:25.164 [2024-07-25 07:32:57.692352] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:25.164 [2024-07-25 07:32:57.692389] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:25.164 [2024-07-25 07:32:57.692400] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1606bc0 name raid_bdev1, state offline 00:27:25.422 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:27:25.422 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.422 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:27:25.422 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:27:25.422 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:27:25.422 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:27:25.422 07:32:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:25.681 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:27:25.681 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:27:25.681 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:27:25.681 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:27:25.681 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@534 -- # i=1 00:27:25.681 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:25.939 [2024-07-25 07:32:58.370030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:25.939 [2024-07-25 07:32:58.370076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:25.940 [2024-07-25 07:32:58.370096] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16086e0 00:27:25.940 
[2024-07-25 07:32:58.370107] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:25.940 [2024-07-25 07:32:58.371590] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:25.940 [2024-07-25 07:32:58.371618] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:25.940 [2024-07-25 07:32:58.371675] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:25.940 [2024-07-25 07:32:58.371700] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:25.940 [2024-07-25 07:32:58.371779] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17b2770 00:27:25.940 [2024-07-25 07:32:58.371788] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:25.940 [2024-07-25 07:32:58.371944] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16096e0 00:27:25.940 [2024-07-25 07:32:58.372054] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17b2770 00:27:25.940 [2024-07-25 07:32:58.372062] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17b2770 00:27:25.940 [2024-07-25 07:32:58.372162] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:25.940 pt2 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.940 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.198 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.198 "name": "raid_bdev1", 00:27:26.199 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:26.199 "strip_size_kb": 0, 00:27:26.199 "state": "online", 00:27:26.199 "raid_level": "raid1", 00:27:26.199 "superblock": true, 00:27:26.199 "num_base_bdevs": 2, 00:27:26.199 "num_base_bdevs_discovered": 1, 00:27:26.199 "num_base_bdevs_operational": 1, 00:27:26.199 "base_bdevs_list": [ 00:27:26.199 { 00:27:26.199 "name": null, 00:27:26.199 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.199 "is_configured": false, 00:27:26.199 "data_offset": 256, 00:27:26.199 "data_size": 7936 00:27:26.199 }, 00:27:26.199 { 
00:27:26.199 "name": "pt2", 00:27:26.199 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:26.199 "is_configured": true, 00:27:26.199 "data_offset": 256, 00:27:26.199 "data_size": 7936 00:27:26.199 } 00:27:26.199 ] 00:27:26.199 }' 00:27:26.199 07:32:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.199 07:32:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:26.766 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:27.024 [2024-07-25 07:32:59.384706] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:27.024 [2024-07-25 07:32:59.384730] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:27.024 [2024-07-25 07:32:59.384783] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:27.024 [2024-07-25 07:32:59.384823] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:27.024 [2024-07-25 07:32:59.384833] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b2770 name raid_bdev1, state offline 00:27:27.024 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.024 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:27:27.282 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:27:27.282 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:27:27.282 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:27:27.282 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:27.541 [2024-07-25 07:32:59.841890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:27.541 [2024-07-25 07:32:59.841931] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.541 [2024-07-25 07:32:59.841948] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b1470 00:27:27.541 [2024-07-25 07:32:59.841959] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.541 [2024-07-25 07:32:59.843430] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.541 [2024-07-25 07:32:59.843455] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:27.541 [2024-07-25 07:32:59.843510] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:27.541 [2024-07-25 07:32:59.843533] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:27.541 [2024-07-25 07:32:59.843620] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:27.541 [2024-07-25 07:32:59.843632] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:27.541 [2024-07-25 07:32:59.843643] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b9320 name raid_bdev1, 
state configuring 00:27:27.541 [2024-07-25 07:32:59.843663] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:27.541 [2024-07-25 07:32:59.843711] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17b8a50 00:27:27.541 [2024-07-25 07:32:59.843720] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:27.541 [2024-07-25 07:32:59.843870] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1608ed0 00:27:27.541 [2024-07-25 07:32:59.843981] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17b8a50 00:27:27.541 [2024-07-25 07:32:59.843990] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17b8a50 00:27:27.541 [2024-07-25 07:32:59.844073] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:27.541 pt1 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.541 07:32:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.800 07:33:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:27.800 "name": "raid_bdev1", 00:27:27.800 "uuid": "1c81c93a-38c3-4581-9856-6b51a0faad22", 00:27:27.800 "strip_size_kb": 0, 00:27:27.800 "state": "online", 00:27:27.800 "raid_level": "raid1", 00:27:27.800 "superblock": true, 00:27:27.800 "num_base_bdevs": 2, 00:27:27.800 "num_base_bdevs_discovered": 1, 00:27:27.800 "num_base_bdevs_operational": 1, 00:27:27.800 "base_bdevs_list": [ 00:27:27.800 { 00:27:27.800 "name": null, 00:27:27.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.800 "is_configured": false, 00:27:27.800 "data_offset": 256, 00:27:27.800 "data_size": 7936 00:27:27.800 }, 00:27:27.800 { 00:27:27.800 "name": "pt2", 00:27:27.800 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:27.800 "is_configured": true, 00:27:27.800 "data_offset": 256, 00:27:27.800 "data_size": 7936 00:27:27.800 } 00:27:27.800 ] 00:27:27.800 }' 00:27:27.800 07:33:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:27.800 
07:33:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:28.364 07:33:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:28.364 07:33:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:28.364 07:33:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:27:28.364 07:33:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:28.364 07:33:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:27:28.622 [2024-07-25 07:33:01.093401] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:28.623 07:33:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # '[' 1c81c93a-38c3-4581-9856-6b51a0faad22 '!=' 1c81c93a-38c3-4581-9856-6b51a0faad22 ']' 00:27:28.623 07:33:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@578 -- # killprocess 1754679 00:27:28.623 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # '[' -z 1754679 ']' 00:27:28.623 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # kill -0 1754679 00:27:28.623 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # uname 00:27:28.623 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:28.623 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1754679 00:27:28.881 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:28.881 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:28.881 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1754679' 00:27:28.881 killing process with pid 1754679 00:27:28.881 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@969 -- # kill 1754679 00:27:28.881 [2024-07-25 07:33:01.172346] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:28.881 [2024-07-25 07:33:01.172394] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:28.881 [2024-07-25 07:33:01.172432] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:28.881 [2024-07-25 07:33:01.172443] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b8a50 name raid_bdev1, state offline 00:27:28.881 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@974 -- # wait 1754679 00:27:28.881 [2024-07-25 07:33:01.188270] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:28.881 07:33:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@580 -- # return 0 00:27:28.881 00:27:28.881 real 0m14.757s 00:27:28.881 user 0m26.665s 00:27:28.881 sys 0m2.774s 00:27:28.881 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:28.882 07:33:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:28.882 ************************************ 00:27:28.882 END TEST 
raid_superblock_test_4k 00:27:28.882 ************************************ 00:27:29.140 07:33:01 bdev_raid -- bdev/bdev_raid.sh@980 -- # '[' true = true ']' 00:27:29.140 07:33:01 bdev_raid -- bdev/bdev_raid.sh@981 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:27:29.140 07:33:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:27:29.140 07:33:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:29.140 07:33:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:29.140 ************************************ 00:27:29.140 START TEST raid_rebuild_test_sb_4k 00:27:29.140 ************************************ 00:27:29.140 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:27:29.140 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:27:29.140 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:27:29.140 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # local verify=true 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # local strip_size 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # local create_arg 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@594 -- # local data_offset 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # raid_pid=1757554 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@613 -- # waitforlisten 1757554 /var/tmp/spdk-raid.sock 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1757554 ']' 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:29.141 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:29.141 07:33:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:29.141 [2024-07-25 07:33:01.526661] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:27:29.141 [2024-07-25 07:33:01.526717] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1757554 ] 00:27:29.141 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:29.141 Zero copy mechanism will not be used. 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.0 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.1 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.2 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.3 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.4 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.5 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.6 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:01.7 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:02.0 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:02.1 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:02.2 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:02.3 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 
0000:3d:02.4 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:02.5 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:02.6 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3d:02.7 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.0 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.1 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.2 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.3 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.4 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.5 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.6 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:01.7 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.0 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.1 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.2 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.3 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.4 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.5 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.6 cannot be used 00:27:29.141 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:29.141 EAL: Requested device 0000:3f:02.7 cannot be used 00:27:29.141 [2024-07-25 07:33:01.657522] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:29.400 [2024-07-25 07:33:01.746209] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:27:29.400 [2024-07-25 07:33:01.810511] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:29.400 [2024-07-25 07:33:01.810547] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:29.966 07:33:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:29.966 07:33:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:27:29.966 07:33:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:29.966 07:33:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:27:30.225 BaseBdev1_malloc 00:27:30.225 07:33:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:30.483 [2024-07-25 07:33:02.868040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:30.483 [2024-07-25 07:33:02.868086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:30.483 [2024-07-25 07:33:02.868108] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd7e690 00:27:30.483 [2024-07-25 07:33:02.868119] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:30.483 [2024-07-25 07:33:02.869582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:30.483 [2024-07-25 07:33:02.869607] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:30.483 BaseBdev1 00:27:30.483 07:33:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:30.483 07:33:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:27:30.741 BaseBdev2_malloc 00:27:30.741 07:33:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:30.999 [2024-07-25 07:33:03.325555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:30.999 [2024-07-25 07:33:03.325593] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:30.999 [2024-07-25 07:33:03.325613] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd7f050 00:27:30.999 [2024-07-25 07:33:03.325625] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:30.999 [2024-07-25 07:33:03.326904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:30.999 [2024-07-25 07:33:03.326930] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:30.999 BaseBdev2 00:27:30.999 07:33:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:27:31.257 spare_malloc 00:27:31.257 07:33:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:31.515 spare_delay 00:27:31.515 07:33:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:31.515 [2024-07-25 07:33:04.031731] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:31.515 [2024-07-25 07:33:04.031773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:31.515 [2024-07-25 07:33:04.031793] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe1f810 00:27:31.515 [2024-07-25 07:33:04.031805] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:31.515 [2024-07-25 07:33:04.033202] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:31.515 [2024-07-25 07:33:04.033228] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:31.515 spare 00:27:31.515 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:31.772 [2024-07-25 07:33:04.256370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:31.772 [2024-07-25 07:33:04.257553] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:31.772 [2024-07-25 07:33:04.257715] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe1e6c0 00:27:31.772 [2024-07-25 07:33:04.257727] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:31.772 [2024-07-25 07:33:04.257901] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd80780 00:27:31.772 [2024-07-25 07:33:04.258033] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe1e6c0 00:27:31.772 [2024-07-25 07:33:04.258042] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe1e6c0 00:27:31.772 [2024-07-25 07:33:04.258134] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.772 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.030 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.030 "name": "raid_bdev1", 00:27:32.030 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:32.030 "strip_size_kb": 0, 00:27:32.030 "state": "online", 00:27:32.030 "raid_level": "raid1", 00:27:32.030 "superblock": true, 00:27:32.030 "num_base_bdevs": 2, 00:27:32.030 
"num_base_bdevs_discovered": 2, 00:27:32.030 "num_base_bdevs_operational": 2, 00:27:32.030 "base_bdevs_list": [ 00:27:32.030 { 00:27:32.030 "name": "BaseBdev1", 00:27:32.030 "uuid": "e059631e-a128-50a0-a666-5a090fe26136", 00:27:32.030 "is_configured": true, 00:27:32.030 "data_offset": 256, 00:27:32.030 "data_size": 7936 00:27:32.030 }, 00:27:32.030 { 00:27:32.030 "name": "BaseBdev2", 00:27:32.030 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:32.030 "is_configured": true, 00:27:32.030 "data_offset": 256, 00:27:32.030 "data_size": 7936 00:27:32.030 } 00:27:32.030 ] 00:27:32.030 }' 00:27:32.030 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.030 07:33:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:32.597 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:32.597 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:27:32.855 [2024-07-25 07:33:05.287263] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:32.856 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:27:32.856 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.856 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:33.114 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:33.373 [2024-07-25 07:33:05.748303] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe2a120 00:27:33.373 /dev/nbd0 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:33.373 07:33:05 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:33.373 1+0 records in 00:27:33.373 1+0 records out 00:27:33.373 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224822 s, 18.2 MB/s 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:27:33.373 07:33:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:34.309 7936+0 records in 00:27:34.309 7936+0 records out 00:27:34.309 32505856 bytes (33 MB, 31 MiB) copied, 0.669005 s, 48.6 MB/s 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:34.309 [2024-07-25 
07:33:06.669795] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:34.309 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:34.567 [2024-07-25 07:33:06.898433] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.567 07:33:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:34.826 07:33:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.826 "name": "raid_bdev1", 00:27:34.826 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:34.826 "strip_size_kb": 0, 00:27:34.826 "state": "online", 00:27:34.826 "raid_level": "raid1", 00:27:34.826 "superblock": true, 00:27:34.826 "num_base_bdevs": 2, 00:27:34.826 "num_base_bdevs_discovered": 1, 00:27:34.826 "num_base_bdevs_operational": 1, 00:27:34.826 "base_bdevs_list": [ 00:27:34.826 { 00:27:34.826 "name": null, 00:27:34.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.826 "is_configured": false, 00:27:34.826 "data_offset": 256, 00:27:34.826 "data_size": 7936 00:27:34.826 }, 00:27:34.826 { 00:27:34.826 "name": "BaseBdev2", 00:27:34.826 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:34.826 "is_configured": true, 
00:27:34.826 "data_offset": 256, 00:27:34.826 "data_size": 7936 00:27:34.826 } 00:27:34.826 ] 00:27:34.826 }' 00:27:34.826 07:33:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.826 07:33:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:35.394 07:33:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:35.653 [2024-07-25 07:33:07.941216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:35.653 [2024-07-25 07:33:07.945942] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe2a0c0 00:27:35.653 [2024-07-25 07:33:07.948057] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:35.653 07:33:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:36.588 07:33:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:36.588 07:33:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:36.588 07:33:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:36.588 07:33:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:36.588 07:33:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:36.588 07:33:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.588 07:33:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.847 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:36.847 "name": "raid_bdev1", 00:27:36.847 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:36.847 "strip_size_kb": 0, 00:27:36.847 "state": "online", 00:27:36.847 "raid_level": "raid1", 00:27:36.847 "superblock": true, 00:27:36.847 "num_base_bdevs": 2, 00:27:36.847 "num_base_bdevs_discovered": 2, 00:27:36.847 "num_base_bdevs_operational": 2, 00:27:36.847 "process": { 00:27:36.847 "type": "rebuild", 00:27:36.847 "target": "spare", 00:27:36.847 "progress": { 00:27:36.847 "blocks": 3072, 00:27:36.847 "percent": 38 00:27:36.847 } 00:27:36.847 }, 00:27:36.847 "base_bdevs_list": [ 00:27:36.847 { 00:27:36.847 "name": "spare", 00:27:36.847 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:36.847 "is_configured": true, 00:27:36.847 "data_offset": 256, 00:27:36.847 "data_size": 7936 00:27:36.847 }, 00:27:36.847 { 00:27:36.847 "name": "BaseBdev2", 00:27:36.847 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:36.847 "is_configured": true, 00:27:36.847 "data_offset": 256, 00:27:36.847 "data_size": 7936 00:27:36.847 } 00:27:36.847 ] 00:27:36.847 }' 00:27:36.847 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:36.847 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:36.847 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:36.847 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 
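The verify_raid_bdev_process checks traced above reduce to one pattern: dump the raid bdev over the RPC socket and compare the rebuild process fields against what the test expects. A minimal standalone sketch of that pattern, assuming an SPDK application listening on /var/tmp/spdk-raid.sock and a raid bdev named raid_bdev1 as in the trace (the check_rebuild helper name is illustrative, not part of bdev_raid.sh):

    #!/usr/bin/env bash
    # Sketch of the rebuild-state check exercised by the test above.
    # Assumes an SPDK target on /var/tmp/spdk-raid.sock exposing a bdev named raid_bdev1.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock

    check_rebuild() {   # illustrative helper, not taken from bdev_raid.sh
        local expected_type=$1 expected_target=$2
        local info
        # Dump all raid bdevs and keep only raid_bdev1.
        info=$("$RPC" -s "$SOCK" bdev_raid_get_bdevs all |
               jq -r '.[] | select(.name == "raid_bdev1")')
        # ".process" is only present while a rebuild is running; default both fields to "none".
        [[ $(jq -r '.process.type // "none"' <<< "$info") == "$expected_type" &&
           $(jq -r '.process.target // "none"' <<< "$info") == "$expected_target" ]]
    }

    check_rebuild rebuild spare   # a rebuild onto "spare" is in progress
    check_rebuild none none       # no rebuild is running

The same dump also carries num_base_bdevs_discovered and base_bdevs_list, which is what verify_raid_bdev_state inspects after each base bdev removal in the trace.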
00:27:36.847 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:37.105 [2024-07-25 07:33:09.494720] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:37.106 [2024-07-25 07:33:09.559741] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:37.106 [2024-07-25 07:33:09.559783] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:37.106 [2024-07-25 07:33:09.559803] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:37.106 [2024-07-25 07:33:09.559811] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.106 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.364 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:37.364 "name": "raid_bdev1", 00:27:37.364 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:37.364 "strip_size_kb": 0, 00:27:37.364 "state": "online", 00:27:37.364 "raid_level": "raid1", 00:27:37.364 "superblock": true, 00:27:37.364 "num_base_bdevs": 2, 00:27:37.364 "num_base_bdevs_discovered": 1, 00:27:37.364 "num_base_bdevs_operational": 1, 00:27:37.364 "base_bdevs_list": [ 00:27:37.364 { 00:27:37.364 "name": null, 00:27:37.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.364 "is_configured": false, 00:27:37.364 "data_offset": 256, 00:27:37.364 "data_size": 7936 00:27:37.364 }, 00:27:37.364 { 00:27:37.364 "name": "BaseBdev2", 00:27:37.364 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:37.364 "is_configured": true, 00:27:37.364 "data_offset": 256, 00:27:37.364 "data_size": 7936 00:27:37.364 } 00:27:37.364 ] 00:27:37.364 }' 00:27:37.364 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.364 07:33:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:37.929 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@674 -- 
# verify_raid_bdev_process raid_bdev1 none none 00:27:37.929 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:37.929 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:37.929 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:37.929 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:37.929 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.929 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.187 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:38.187 "name": "raid_bdev1", 00:27:38.187 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:38.187 "strip_size_kb": 0, 00:27:38.187 "state": "online", 00:27:38.187 "raid_level": "raid1", 00:27:38.188 "superblock": true, 00:27:38.188 "num_base_bdevs": 2, 00:27:38.188 "num_base_bdevs_discovered": 1, 00:27:38.188 "num_base_bdevs_operational": 1, 00:27:38.188 "base_bdevs_list": [ 00:27:38.188 { 00:27:38.188 "name": null, 00:27:38.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:38.188 "is_configured": false, 00:27:38.188 "data_offset": 256, 00:27:38.188 "data_size": 7936 00:27:38.188 }, 00:27:38.188 { 00:27:38.188 "name": "BaseBdev2", 00:27:38.188 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:38.188 "is_configured": true, 00:27:38.188 "data_offset": 256, 00:27:38.188 "data_size": 7936 00:27:38.188 } 00:27:38.188 ] 00:27:38.188 }' 00:27:38.188 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:38.188 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:38.188 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:38.188 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:38.188 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:38.446 [2024-07-25 07:33:10.851333] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:38.446 [2024-07-25 07:33:10.856018] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd758f0 00:27:38.446 [2024-07-25 07:33:10.857367] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:38.446 07:33:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@678 -- # sleep 1 00:27:39.381 07:33:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:39.381 07:33:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:39.381 07:33:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:39.381 07:33:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:39.381 07:33:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:39.381 07:33:11 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.381 07:33:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:39.640 "name": "raid_bdev1", 00:27:39.640 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:39.640 "strip_size_kb": 0, 00:27:39.640 "state": "online", 00:27:39.640 "raid_level": "raid1", 00:27:39.640 "superblock": true, 00:27:39.640 "num_base_bdevs": 2, 00:27:39.640 "num_base_bdevs_discovered": 2, 00:27:39.640 "num_base_bdevs_operational": 2, 00:27:39.640 "process": { 00:27:39.640 "type": "rebuild", 00:27:39.640 "target": "spare", 00:27:39.640 "progress": { 00:27:39.640 "blocks": 2816, 00:27:39.640 "percent": 35 00:27:39.640 } 00:27:39.640 }, 00:27:39.640 "base_bdevs_list": [ 00:27:39.640 { 00:27:39.640 "name": "spare", 00:27:39.640 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:39.640 "is_configured": true, 00:27:39.640 "data_offset": 256, 00:27:39.640 "data_size": 7936 00:27:39.640 }, 00:27:39.640 { 00:27:39.640 "name": "BaseBdev2", 00:27:39.640 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:39.640 "is_configured": true, 00:27:39.640 "data_offset": 256, 00:27:39.640 "data_size": 7936 00:27:39.640 } 00:27:39.640 ] 00:27:39.640 }' 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:27:39.640 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # local timeout=970 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.640 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.898 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:39.898 "name": "raid_bdev1", 00:27:39.898 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:39.898 "strip_size_kb": 0, 00:27:39.898 "state": "online", 00:27:39.898 "raid_level": "raid1", 00:27:39.898 "superblock": true, 00:27:39.898 "num_base_bdevs": 2, 00:27:39.898 "num_base_bdevs_discovered": 2, 00:27:39.898 "num_base_bdevs_operational": 2, 00:27:39.898 "process": { 00:27:39.898 "type": "rebuild", 00:27:39.898 "target": "spare", 00:27:39.898 "progress": { 00:27:39.898 "blocks": 3584, 00:27:39.898 "percent": 45 00:27:39.898 } 00:27:39.898 }, 00:27:39.898 "base_bdevs_list": [ 00:27:39.898 { 00:27:39.898 "name": "spare", 00:27:39.898 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:39.898 "is_configured": true, 00:27:39.898 "data_offset": 256, 00:27:39.898 "data_size": 7936 00:27:39.898 }, 00:27:39.898 { 00:27:39.898 "name": "BaseBdev2", 00:27:39.898 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:39.898 "is_configured": true, 00:27:39.898 "data_offset": 256, 00:27:39.898 "data_size": 7936 00:27:39.898 } 00:27:39.898 ] 00:27:39.898 }' 00:27:39.899 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:39.899 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:39.899 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:40.157 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:40.157 07:33:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.091 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.349 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:41.349 "name": "raid_bdev1", 00:27:41.349 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:41.349 "strip_size_kb": 0, 00:27:41.349 "state": "online", 00:27:41.349 "raid_level": "raid1", 00:27:41.349 "superblock": true, 00:27:41.349 "num_base_bdevs": 2, 00:27:41.349 "num_base_bdevs_discovered": 2, 00:27:41.349 "num_base_bdevs_operational": 2, 00:27:41.349 "process": { 00:27:41.349 "type": "rebuild", 00:27:41.349 "target": "spare", 
00:27:41.349 "progress": { 00:27:41.349 "blocks": 7168, 00:27:41.349 "percent": 90 00:27:41.349 } 00:27:41.349 }, 00:27:41.349 "base_bdevs_list": [ 00:27:41.349 { 00:27:41.349 "name": "spare", 00:27:41.349 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:41.349 "is_configured": true, 00:27:41.349 "data_offset": 256, 00:27:41.349 "data_size": 7936 00:27:41.349 }, 00:27:41.349 { 00:27:41.349 "name": "BaseBdev2", 00:27:41.349 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:41.349 "is_configured": true, 00:27:41.349 "data_offset": 256, 00:27:41.349 "data_size": 7936 00:27:41.349 } 00:27:41.349 ] 00:27:41.349 }' 00:27:41.349 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:41.349 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:41.349 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:41.349 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:41.349 07:33:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:41.608 [2024-07-25 07:33:13.979988] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:41.608 [2024-07-25 07:33:13.980039] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:41.608 [2024-07-25 07:33:13.980119] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:42.541 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:42.541 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:42.541 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.541 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:42.541 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:42.542 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.542 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.542 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.542 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:42.542 "name": "raid_bdev1", 00:27:42.542 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:42.542 "strip_size_kb": 0, 00:27:42.542 "state": "online", 00:27:42.542 "raid_level": "raid1", 00:27:42.542 "superblock": true, 00:27:42.542 "num_base_bdevs": 2, 00:27:42.542 "num_base_bdevs_discovered": 2, 00:27:42.542 "num_base_bdevs_operational": 2, 00:27:42.542 "base_bdevs_list": [ 00:27:42.542 { 00:27:42.542 "name": "spare", 00:27:42.542 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:42.542 "is_configured": true, 00:27:42.542 "data_offset": 256, 00:27:42.542 "data_size": 7936 00:27:42.542 }, 00:27:42.542 { 00:27:42.542 "name": "BaseBdev2", 00:27:42.542 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:42.542 "is_configured": true, 00:27:42.542 "data_offset": 256, 00:27:42.542 "data_size": 7936 00:27:42.542 } 00:27:42.542 ] 
00:27:42.542 }' 00:27:42.542 07:33:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@724 -- # break 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.542 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.800 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:42.800 "name": "raid_bdev1", 00:27:42.800 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:42.800 "strip_size_kb": 0, 00:27:42.800 "state": "online", 00:27:42.800 "raid_level": "raid1", 00:27:42.800 "superblock": true, 00:27:42.800 "num_base_bdevs": 2, 00:27:42.800 "num_base_bdevs_discovered": 2, 00:27:42.800 "num_base_bdevs_operational": 2, 00:27:42.800 "base_bdevs_list": [ 00:27:42.800 { 00:27:42.800 "name": "spare", 00:27:42.800 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:42.800 "is_configured": true, 00:27:42.800 "data_offset": 256, 00:27:42.800 "data_size": 7936 00:27:42.800 }, 00:27:42.800 { 00:27:42.800 "name": "BaseBdev2", 00:27:42.800 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:42.800 "is_configured": true, 00:27:42.800 "data_offset": 256, 00:27:42.800 "data_size": 7936 00:27:42.800 } 00:27:42.800 ] 00:27:42.800 }' 00:27:42.800 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.058 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.317 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.317 "name": "raid_bdev1", 00:27:43.317 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:43.317 "strip_size_kb": 0, 00:27:43.317 "state": "online", 00:27:43.317 "raid_level": "raid1", 00:27:43.317 "superblock": true, 00:27:43.317 "num_base_bdevs": 2, 00:27:43.317 "num_base_bdevs_discovered": 2, 00:27:43.317 "num_base_bdevs_operational": 2, 00:27:43.317 "base_bdevs_list": [ 00:27:43.317 { 00:27:43.317 "name": "spare", 00:27:43.317 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:43.317 "is_configured": true, 00:27:43.317 "data_offset": 256, 00:27:43.317 "data_size": 7936 00:27:43.317 }, 00:27:43.317 { 00:27:43.317 "name": "BaseBdev2", 00:27:43.317 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:43.317 "is_configured": true, 00:27:43.317 "data_offset": 256, 00:27:43.317 "data_size": 7936 00:27:43.317 } 00:27:43.317 ] 00:27:43.317 }' 00:27:43.317 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.317 07:33:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:43.884 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:43.884 [2024-07-25 07:33:16.378448] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:43.884 [2024-07-25 07:33:16.378472] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:43.884 [2024-07-25 07:33:16.378526] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:43.884 [2024-07-25 07:33:16.378579] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:43.884 [2024-07-25 07:33:16.378590] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1e6c0 name raid_bdev1, state offline 00:27:43.884 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.884 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # jq length 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 
-- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:44.143 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:44.402 /dev/nbd0 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:44.402 1+0 records in 00:27:44.402 1+0 records out 00:27:44.402 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241804 s, 16.9 MB/s 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:44.402 07:33:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:44.660 /dev/nbd1 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:44.661 1+0 records in 00:27:44.661 1+0 records out 00:27:44.661 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032807 s, 12.5 MB/s 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:44.661 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:44.919 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:44.919 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:44.919 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:44.919 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:44.919 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:44.919 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:44.919 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:45.177 
07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:45.177 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:45.435 07:33:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:45.692 [2024-07-25 07:33:18.175290] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:45.692 [2024-07-25 07:33:18.175331] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:45.692 [2024-07-25 07:33:18.175350] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2a3e0 00:27:45.692 [2024-07-25 07:33:18.175361] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:45.692 [2024-07-25 07:33:18.176838] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:45.692 [2024-07-25 07:33:18.176867] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:45.692 [2024-07-25 07:33:18.176934] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:45.692 [2024-07-25 07:33:18.176958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:45.692 [2024-07-25 07:33:18.177049] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:45.692 spare 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.692 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.949 [2024-07-25 07:33:18.277354] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe20910 00:27:45.949 [2024-07-25 07:33:18.277369] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:45.949 [2024-07-25 07:33:18.277546] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd811b0 00:27:45.949 [2024-07-25 07:33:18.277681] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe20910 00:27:45.949 [2024-07-25 07:33:18.277691] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe20910 00:27:45.949 [2024-07-25 07:33:18.277787] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:45.949 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.949 "name": "raid_bdev1", 00:27:45.949 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:45.949 "strip_size_kb": 0, 00:27:45.949 "state": "online", 00:27:45.949 "raid_level": "raid1", 00:27:45.949 "superblock": true, 00:27:45.949 "num_base_bdevs": 2, 00:27:45.949 "num_base_bdevs_discovered": 2, 00:27:45.949 "num_base_bdevs_operational": 2, 00:27:45.949 "base_bdevs_list": [ 00:27:45.949 { 00:27:45.949 "name": "spare", 00:27:45.949 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:45.949 "is_configured": true, 00:27:45.950 "data_offset": 256, 00:27:45.950 "data_size": 7936 00:27:45.950 }, 00:27:45.950 { 00:27:45.950 "name": "BaseBdev2", 00:27:45.950 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:45.950 "is_configured": true, 00:27:45.950 "data_offset": 256, 00:27:45.950 "data_size": 7936 00:27:45.950 } 00:27:45.950 ] 00:27:45.950 }' 00:27:45.950 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.950 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:46.513 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:46.513 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:27:46.513 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:46.513 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:46.513 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.513 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.513 07:33:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.771 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.771 "name": "raid_bdev1", 00:27:46.771 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:46.771 "strip_size_kb": 0, 00:27:46.771 "state": "online", 00:27:46.771 "raid_level": "raid1", 00:27:46.771 "superblock": true, 00:27:46.771 "num_base_bdevs": 2, 00:27:46.771 "num_base_bdevs_discovered": 2, 00:27:46.771 "num_base_bdevs_operational": 2, 00:27:46.771 "base_bdevs_list": [ 00:27:46.771 { 00:27:46.771 "name": "spare", 00:27:46.771 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:46.771 "is_configured": true, 00:27:46.771 "data_offset": 256, 00:27:46.771 "data_size": 7936 00:27:46.771 }, 00:27:46.771 { 00:27:46.771 "name": "BaseBdev2", 00:27:46.771 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:46.771 "is_configured": true, 00:27:46.771 "data_offset": 256, 00:27:46.771 "data_size": 7936 00:27:46.771 } 00:27:46.771 ] 00:27:46.771 }' 00:27:46.771 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.771 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:46.771 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.771 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:46.771 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.771 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:47.030 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:27:47.030 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:47.321 [2024-07-25 07:33:19.699395] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
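Removing the spare via bdev_raid_remove_base_bdev above drops raid_bdev1 to a single operational base bdev, and the verify_raid_bdev_state call that continues below confirms it stays online in that degraded state. The same check can be reproduced standalone against the RPC socket shown in this run; the snippet is a sketch only (it assumes scripts/rpc.py is reachable from the SPDK tree and is not part of bdev_raid.sh):

    # Sketch only: confirm raid_bdev1 is online and degraded to one
    # operational base bdev after the spare was removed.
    sock=/var/tmp/spdk-raid.sock
    info=$(scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r '.state' <<< "$info") == online ]] || exit 1
    [[ $(jq -r '.num_base_bdevs_operational' <<< "$info") -eq 1 ]] || exit 1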
00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.321 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.580 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:47.580 "name": "raid_bdev1", 00:27:47.580 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:47.580 "strip_size_kb": 0, 00:27:47.580 "state": "online", 00:27:47.580 "raid_level": "raid1", 00:27:47.580 "superblock": true, 00:27:47.580 "num_base_bdevs": 2, 00:27:47.580 "num_base_bdevs_discovered": 1, 00:27:47.580 "num_base_bdevs_operational": 1, 00:27:47.580 "base_bdevs_list": [ 00:27:47.580 { 00:27:47.580 "name": null, 00:27:47.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.580 "is_configured": false, 00:27:47.580 "data_offset": 256, 00:27:47.580 "data_size": 7936 00:27:47.580 }, 00:27:47.580 { 00:27:47.580 "name": "BaseBdev2", 00:27:47.580 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:47.580 "is_configured": true, 00:27:47.580 "data_offset": 256, 00:27:47.580 "data_size": 7936 00:27:47.580 } 00:27:47.580 ] 00:27:47.580 }' 00:27:47.580 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:47.580 07:33:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:48.146 07:33:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:48.404 [2024-07-25 07:33:20.682003] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:48.404 [2024-07-25 07:33:20.682144] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:48.404 [2024-07-25 07:33:20.682160] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
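Re-adding the spare with bdev_raid_add_base_bdev starts another rebuild (the "Started rebuild" notice follows below). Where a caller wants to wait for such a rebuild to drain instead of sleeping a fixed second, the SECONDS-based loop used earlier in this test reduces to roughly the sketch below; the 120-second deadline is an illustrative value, not the test's own timeout:

    # Sketch of the wait-for-rebuild pattern: poll until no process is reported.
    sock=/var/tmp/spdk-raid.sock
    deadline=$((SECONDS + 120))
    while (( SECONDS < deadline )); do
        ptype=$(scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all |
                jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')
        [[ $ptype == none ]] && break
        sleep 1
    done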
00:27:48.404 [2024-07-25 07:33:20.682186] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:48.404 [2024-07-25 07:33:20.686813] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd816d0 00:27:48.404 [2024-07-25 07:33:20.688058] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:48.404 07:33:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # sleep 1 00:27:49.337 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:49.337 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:49.337 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:49.337 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:49.337 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:49.337 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.337 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.595 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:49.595 "name": "raid_bdev1", 00:27:49.595 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:49.595 "strip_size_kb": 0, 00:27:49.595 "state": "online", 00:27:49.595 "raid_level": "raid1", 00:27:49.595 "superblock": true, 00:27:49.595 "num_base_bdevs": 2, 00:27:49.595 "num_base_bdevs_discovered": 2, 00:27:49.595 "num_base_bdevs_operational": 2, 00:27:49.595 "process": { 00:27:49.595 "type": "rebuild", 00:27:49.595 "target": "spare", 00:27:49.595 "progress": { 00:27:49.595 "blocks": 3072, 00:27:49.595 "percent": 38 00:27:49.595 } 00:27:49.595 }, 00:27:49.595 "base_bdevs_list": [ 00:27:49.595 { 00:27:49.595 "name": "spare", 00:27:49.595 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:49.595 "is_configured": true, 00:27:49.595 "data_offset": 256, 00:27:49.595 "data_size": 7936 00:27:49.595 }, 00:27:49.595 { 00:27:49.595 "name": "BaseBdev2", 00:27:49.595 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:49.595 "is_configured": true, 00:27:49.595 "data_offset": 256, 00:27:49.595 "data_size": 7936 00:27:49.595 } 00:27:49.595 ] 00:27:49.595 }' 00:27:49.595 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:49.595 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:49.595 07:33:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:49.595 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:49.595 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:49.854 [2024-07-25 07:33:22.239715] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:49.854 [2024-07-25 07:33:22.299846] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:49.854 [2024-07-25 07:33:22.299887] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:49.854 [2024-07-25 07:33:22.299901] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:49.854 [2024-07-25 07:33:22.299909] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.854 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.112 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.112 "name": "raid_bdev1", 00:27:50.112 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:50.112 "strip_size_kb": 0, 00:27:50.112 "state": "online", 00:27:50.112 "raid_level": "raid1", 00:27:50.112 "superblock": true, 00:27:50.112 "num_base_bdevs": 2, 00:27:50.112 "num_base_bdevs_discovered": 1, 00:27:50.112 "num_base_bdevs_operational": 1, 00:27:50.112 "base_bdevs_list": [ 00:27:50.112 { 00:27:50.112 "name": null, 00:27:50.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.112 "is_configured": false, 00:27:50.112 "data_offset": 256, 00:27:50.112 "data_size": 7936 00:27:50.112 }, 00:27:50.112 { 00:27:50.112 "name": "BaseBdev2", 00:27:50.112 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:50.112 "is_configured": true, 00:27:50.112 "data_offset": 256, 00:27:50.112 "data_size": 7936 00:27:50.112 } 00:27:50.112 ] 00:27:50.112 }' 00:27:50.112 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.112 07:33:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:50.678 07:33:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:50.936 [2024-07-25 07:33:23.230489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:50.936 [2024-07-25 07:33:23.230535] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:50.936 [2024-07-25 07:33:23.230558] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd75390 00:27:50.936 
[2024-07-25 07:33:23.230570] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:50.936 [2024-07-25 07:33:23.230906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:50.936 [2024-07-25 07:33:23.230922] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:50.936 [2024-07-25 07:33:23.230993] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:50.936 [2024-07-25 07:33:23.231004] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:50.936 [2024-07-25 07:33:23.231014] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:50.936 [2024-07-25 07:33:23.231032] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:50.936 [2024-07-25 07:33:23.235706] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd816d0 00:27:50.936 spare 00:27:50.936 [2024-07-25 07:33:23.236949] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:50.936 07:33:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # sleep 1 00:27:51.870 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:51.870 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:51.870 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:51.870 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:51.870 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:51.870 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.870 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.128 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:52.128 "name": "raid_bdev1", 00:27:52.128 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:52.128 "strip_size_kb": 0, 00:27:52.128 "state": "online", 00:27:52.128 "raid_level": "raid1", 00:27:52.128 "superblock": true, 00:27:52.128 "num_base_bdevs": 2, 00:27:52.128 "num_base_bdevs_discovered": 2, 00:27:52.128 "num_base_bdevs_operational": 2, 00:27:52.128 "process": { 00:27:52.128 "type": "rebuild", 00:27:52.128 "target": "spare", 00:27:52.128 "progress": { 00:27:52.128 "blocks": 3072, 00:27:52.128 "percent": 38 00:27:52.128 } 00:27:52.128 }, 00:27:52.128 "base_bdevs_list": [ 00:27:52.128 { 00:27:52.128 "name": "spare", 00:27:52.128 "uuid": "3a8df59c-bb23-5e44-b01a-f0b614b93048", 00:27:52.128 "is_configured": true, 00:27:52.128 "data_offset": 256, 00:27:52.128 "data_size": 7936 00:27:52.128 }, 00:27:52.128 { 00:27:52.128 "name": "BaseBdev2", 00:27:52.128 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:52.128 "is_configured": true, 00:27:52.128 "data_offset": 256, 00:27:52.128 "data_size": 7936 00:27:52.128 } 00:27:52.128 ] 00:27:52.128 }' 00:27:52.128 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:52.128 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:52.128 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:52.128 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:52.128 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:52.387 [2024-07-25 07:33:24.771975] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:52.387 [2024-07-25 07:33:24.848578] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:52.387 [2024-07-25 07:33:24.848622] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:52.387 [2024-07-25 07:33:24.848637] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:52.387 [2024-07-25 07:33:24.848644] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.387 07:33:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.645 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.645 "name": "raid_bdev1", 00:27:52.645 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:52.645 "strip_size_kb": 0, 00:27:52.645 "state": "online", 00:27:52.645 "raid_level": "raid1", 00:27:52.645 "superblock": true, 00:27:52.645 "num_base_bdevs": 2, 00:27:52.645 "num_base_bdevs_discovered": 1, 00:27:52.645 "num_base_bdevs_operational": 1, 00:27:52.645 "base_bdevs_list": [ 00:27:52.645 { 00:27:52.645 "name": null, 00:27:52.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:52.645 "is_configured": false, 00:27:52.645 "data_offset": 256, 00:27:52.645 "data_size": 7936 00:27:52.645 }, 00:27:52.645 { 00:27:52.645 "name": "BaseBdev2", 00:27:52.645 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:52.645 "is_configured": true, 00:27:52.645 "data_offset": 256, 00:27:52.645 "data_size": 7936 00:27:52.645 } 00:27:52.645 ] 00:27:52.645 }' 
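In the state dump just above, deleting the passthru bdev behind the spare mid-rebuild leaves the first base_bdevs_list slot as an unconfigured placeholder (name null, all-zero uuid, is_configured false). That slot can be inspected with the same RPC-plus-jq pattern; again a sketch rather than part of the test script:

    # Sketch: assert the freed slot reads back as unconfigured.
    slot=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "raid_bdev1") | .base_bdevs_list[0]')
    [[ $(jq -r '.name' <<< "$slot") == null ]] || exit 1
    [[ $(jq -r '.is_configured' <<< "$slot") == false ]] || exit 1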
00:27:52.645 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.645 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:53.211 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:53.211 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:53.211 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:53.211 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:53.211 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:53.211 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.211 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:53.469 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:53.469 "name": "raid_bdev1", 00:27:53.469 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:53.469 "strip_size_kb": 0, 00:27:53.469 "state": "online", 00:27:53.469 "raid_level": "raid1", 00:27:53.469 "superblock": true, 00:27:53.469 "num_base_bdevs": 2, 00:27:53.469 "num_base_bdevs_discovered": 1, 00:27:53.469 "num_base_bdevs_operational": 1, 00:27:53.469 "base_bdevs_list": [ 00:27:53.469 { 00:27:53.469 "name": null, 00:27:53.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:53.469 "is_configured": false, 00:27:53.469 "data_offset": 256, 00:27:53.469 "data_size": 7936 00:27:53.469 }, 00:27:53.469 { 00:27:53.469 "name": "BaseBdev2", 00:27:53.469 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:53.469 "is_configured": true, 00:27:53.470 "data_offset": 256, 00:27:53.470 "data_size": 7936 00:27:53.470 } 00:27:53.470 ] 00:27:53.470 }' 00:27:53.470 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:53.470 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:53.470 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:53.470 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:53.470 07:33:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:53.728 07:33:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:53.986 [2024-07-25 07:33:26.372673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:53.986 [2024-07-25 07:33:26.372716] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:53.986 [2024-07-25 07:33:26.372735] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf2f550 00:27:53.986 [2024-07-25 07:33:26.372748] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:53.986 [2024-07-25 07:33:26.373059] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:27:53.986 [2024-07-25 07:33:26.373075] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:53.986 [2024-07-25 07:33:26.373130] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:53.987 [2024-07-25 07:33:26.373151] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:53.987 [2024-07-25 07:33:26.373161] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:53.987 BaseBdev1 00:27:53.987 07:33:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@789 -- # sleep 1 00:27:54.921 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:54.921 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:54.921 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.921 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.921 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.921 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:54.921 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.922 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.922 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.922 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.922 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.922 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:55.180 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:55.180 "name": "raid_bdev1", 00:27:55.180 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:55.180 "strip_size_kb": 0, 00:27:55.180 "state": "online", 00:27:55.180 "raid_level": "raid1", 00:27:55.180 "superblock": true, 00:27:55.180 "num_base_bdevs": 2, 00:27:55.180 "num_base_bdevs_discovered": 1, 00:27:55.180 "num_base_bdevs_operational": 1, 00:27:55.180 "base_bdevs_list": [ 00:27:55.180 { 00:27:55.180 "name": null, 00:27:55.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:55.180 "is_configured": false, 00:27:55.180 "data_offset": 256, 00:27:55.180 "data_size": 7936 00:27:55.180 }, 00:27:55.180 { 00:27:55.180 "name": "BaseBdev2", 00:27:55.180 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:55.180 "is_configured": true, 00:27:55.180 "data_offset": 256, 00:27:55.180 "data_size": 7936 00:27:55.180 } 00:27:55.180 ] 00:27:55.180 }' 00:27:55.180 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:55.180 07:33:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:55.746 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:55.746 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:55.746 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:55.746 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:55.746 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:55.746 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.746 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.004 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:56.004 "name": "raid_bdev1", 00:27:56.004 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:56.004 "strip_size_kb": 0, 00:27:56.004 "state": "online", 00:27:56.004 "raid_level": "raid1", 00:27:56.004 "superblock": true, 00:27:56.004 "num_base_bdevs": 2, 00:27:56.004 "num_base_bdevs_discovered": 1, 00:27:56.004 "num_base_bdevs_operational": 1, 00:27:56.004 "base_bdevs_list": [ 00:27:56.004 { 00:27:56.004 "name": null, 00:27:56.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:56.004 "is_configured": false, 00:27:56.004 "data_offset": 256, 00:27:56.004 "data_size": 7936 00:27:56.004 }, 00:27:56.004 { 00:27:56.004 "name": "BaseBdev2", 00:27:56.004 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:56.004 "is_configured": true, 00:27:56.004 "data_offset": 256, 00:27:56.004 "data_size": 7936 00:27:56.004 } 00:27:56.004 ] 00:27:56.004 }' 00:27:56.005 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:56.005 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:56.005 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # local es=0 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case 
"$(type -t "$arg")" in 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:56.263 [2024-07-25 07:33:28.754960] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:56.263 [2024-07-25 07:33:28.755066] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:56.263 [2024-07-25 07:33:28.755080] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:56.263 request: 00:27:56.263 { 00:27:56.263 "base_bdev": "BaseBdev1", 00:27:56.263 "raid_bdev": "raid_bdev1", 00:27:56.263 "method": "bdev_raid_add_base_bdev", 00:27:56.263 "req_id": 1 00:27:56.263 } 00:27:56.263 Got JSON-RPC error response 00:27:56.263 response: 00:27:56.263 { 00:27:56.263 "code": -22, 00:27:56.263 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:56.263 } 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # es=1 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:56.263 07:33:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@793 -- # sleep 1 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.638 07:33:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.638 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:57.638 "name": "raid_bdev1", 
00:27:57.638 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:57.638 "strip_size_kb": 0, 00:27:57.638 "state": "online", 00:27:57.638 "raid_level": "raid1", 00:27:57.638 "superblock": true, 00:27:57.638 "num_base_bdevs": 2, 00:27:57.638 "num_base_bdevs_discovered": 1, 00:27:57.638 "num_base_bdevs_operational": 1, 00:27:57.638 "base_bdevs_list": [ 00:27:57.638 { 00:27:57.638 "name": null, 00:27:57.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.638 "is_configured": false, 00:27:57.638 "data_offset": 256, 00:27:57.638 "data_size": 7936 00:27:57.638 }, 00:27:57.638 { 00:27:57.638 "name": "BaseBdev2", 00:27:57.638 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:57.638 "is_configured": true, 00:27:57.638 "data_offset": 256, 00:27:57.638 "data_size": 7936 00:27:57.638 } 00:27:57.638 ] 00:27:57.638 }' 00:27:57.638 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:57.638 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:58.204 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:58.204 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:58.204 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:58.204 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:58.204 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:58.204 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.204 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:58.463 "name": "raid_bdev1", 00:27:58.463 "uuid": "a9ebd845-be8b-4c22-993e-ed18f90fa96e", 00:27:58.463 "strip_size_kb": 0, 00:27:58.463 "state": "online", 00:27:58.463 "raid_level": "raid1", 00:27:58.463 "superblock": true, 00:27:58.463 "num_base_bdevs": 2, 00:27:58.463 "num_base_bdevs_discovered": 1, 00:27:58.463 "num_base_bdevs_operational": 1, 00:27:58.463 "base_bdevs_list": [ 00:27:58.463 { 00:27:58.463 "name": null, 00:27:58.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.463 "is_configured": false, 00:27:58.463 "data_offset": 256, 00:27:58.463 "data_size": 7936 00:27:58.463 }, 00:27:58.463 { 00:27:58.463 "name": "BaseBdev2", 00:27:58.463 "uuid": "32733e3f-9e97-50a1-aac8-8bcc5fc4f34a", 00:27:58.463 "is_configured": true, 00:27:58.463 "data_offset": 256, 00:27:58.463 "data_size": 7936 00:27:58.463 } 00:27:58.463 ] 00:27:58.463 }' 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@798 -- # killprocess 1757554 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- 
# '[' -z 1757554 ']' 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1757554 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1757554 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1757554' 00:27:58.463 killing process with pid 1757554 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@969 -- # kill 1757554 00:27:58.463 Received shutdown signal, test time was about 60.000000 seconds 00:27:58.463 00:27:58.463 Latency(us) 00:27:58.463 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:58.463 =================================================================================================================== 00:27:58.463 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:58.463 [2024-07-25 07:33:30.967114] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:58.463 [2024-07-25 07:33:30.967201] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:58.463 [2024-07-25 07:33:30.967241] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:58.463 [2024-07-25 07:33:30.967253] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe20910 name raid_bdev1, state offline 00:27:58.463 07:33:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@974 -- # wait 1757554 00:27:58.463 [2024-07-25 07:33:30.992339] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:58.722 07:33:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@800 -- # return 0 00:27:58.722 00:27:58.722 real 0m29.722s 00:27:58.722 user 0m45.808s 00:27:58.722 sys 0m4.886s 00:27:58.722 07:33:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:58.722 07:33:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:58.722 ************************************ 00:27:58.722 END TEST raid_rebuild_test_sb_4k 00:27:58.722 ************************************ 00:27:58.722 07:33:31 bdev_raid -- bdev/bdev_raid.sh@984 -- # base_malloc_params='-m 32' 00:27:58.722 07:33:31 bdev_raid -- bdev/bdev_raid.sh@985 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:27:58.722 07:33:31 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:27:58.722 07:33:31 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:58.722 07:33:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:58.981 ************************************ 00:27:58.981 START TEST raid_state_function_test_sb_md_separate 00:27:58.981 ************************************ 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- 
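Throughout both tests the harness validates array state the same way: one bdev_raid_get_bdevs RPC, a jq select on the bdev name, then field-by-field comparisons (state, raid_level, base-bdev counts, process type/target). The helper below is a condensed sketch of that pattern, not the verify_raid_bdev_state function from bdev_raid.sh itself; it checks only a subset of the fields and the function name is invented for illustration.

#!/usr/bin/env bash
# Condensed sketch of the state checks repeated in this log. Paths and the
# RPC socket are the ones used by the trace.
set -euo pipefail
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
sock="/var/tmp/spdk-raid.sock"

check_raid_state() {
    local name=$1 expected_state=$2 expected_level=$3 expected_operational=$4
    local info
    # One RPC round trip; all comparisons run against the cached JSON.
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
        | jq -r ".[] | select(.name == \"$name\")")
    [[ $(jq -r '.state'      <<< "$info") == "$expected_state" ]]
    [[ $(jq -r '.raid_level' <<< "$info") == "$expected_level" ]]
    [[ $(jq -r '.num_base_bdevs_operational' <<< "$info") -eq "$expected_operational" ]]
}

# Mirrors the first check in the test that follows: a freshly declared
# Existed_Raid with both base bdevs still missing sits in "configuring".
check_raid_state Existed_Raid configuring raid1 2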
bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:58.981 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1763372 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1763372' 00:27:58.982 Process raid pid: 1763372 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1763372 /var/tmp/spdk-raid.sock 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1763372 ']' 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:58.982 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:58.982 07:33:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:58.982 [2024-07-25 07:33:31.333163] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:27:58.982 [2024-07-25 07:33:31.333221] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.0 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.1 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.2 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.3 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.4 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.5 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.6 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:01.7 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.0 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.1 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.2 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.3 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.4 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.5 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.6 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3d:02.7 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.0 cannot be used 00:27:58.982 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.1 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.2 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.3 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.4 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.5 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.6 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:01.7 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.0 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.1 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.2 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.3 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.4 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.5 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.6 cannot be used 00:27:58.982 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:58.982 EAL: Requested device 0000:3f:02.7 cannot be used 00:27:58.982 [2024-07-25 07:33:31.464150] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:59.241 [2024-07-25 07:33:31.550032] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:27:59.241 [2024-07-25 07:33:31.608865] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:59.241 [2024-07-25 07:33:31.608920] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:59.808 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:59.808 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:27:59.808 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:00.071 [2024-07-25 07:33:32.443076] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:00.071 [2024-07-25 07:33:32.443112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:00.071 [2024-07-25 07:33:32.443122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:00.071 [2024-07-25 07:33:32.443137] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.071 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:00.330 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:00.330 "name": "Existed_Raid", 00:28:00.330 "uuid": "031fb710-9998-44cf-87bb-58859c4de2c9", 00:28:00.330 "strip_size_kb": 0, 00:28:00.330 "state": "configuring", 00:28:00.330 "raid_level": "raid1", 00:28:00.330 "superblock": true, 00:28:00.330 "num_base_bdevs": 2, 00:28:00.330 "num_base_bdevs_discovered": 0, 00:28:00.330 "num_base_bdevs_operational": 2, 00:28:00.330 "base_bdevs_list": [ 00:28:00.330 { 00:28:00.330 "name": "BaseBdev1", 00:28:00.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.330 "is_configured": false, 00:28:00.330 "data_offset": 0, 00:28:00.330 "data_size": 0 00:28:00.330 }, 00:28:00.330 { 00:28:00.330 "name": "BaseBdev2", 00:28:00.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.330 "is_configured": false, 00:28:00.330 "data_offset": 0, 00:28:00.330 "data_size": 0 00:28:00.330 } 00:28:00.330 ] 00:28:00.330 }' 00:28:00.330 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:00.330 07:33:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:00.897 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:01.155 [2024-07-25 07:33:33.481680] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:01.155 [2024-07-25 07:33:33.481705] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ac7ea0 name Existed_Raid, state configuring 00:28:01.155 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:01.155 [2024-07-25 07:33:33.650133] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:01.155 [2024-07-25 07:33:33.650163] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:01.155 [2024-07-25 07:33:33.650172] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:01.155 [2024-07-25 07:33:33.650183] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:01.155 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:28:01.413 [2024-07-25 07:33:33.820558] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:01.413 BaseBdev1 00:28:01.413 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:01.413 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:28:01.413 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:01.413 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:28:01.413 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:01.413 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:01.413 07:33:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:01.671 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:01.671 [ 00:28:01.671 { 00:28:01.671 "name": "BaseBdev1", 00:28:01.671 "aliases": [ 00:28:01.671 "22493f5f-7346-4188-84c0-68dbc9d4b261" 00:28:01.671 ], 00:28:01.671 "product_name": "Malloc disk", 00:28:01.671 "block_size": 4096, 00:28:01.671 "num_blocks": 8192, 00:28:01.671 "uuid": "22493f5f-7346-4188-84c0-68dbc9d4b261", 00:28:01.671 "md_size": 32, 00:28:01.671 "md_interleave": false, 00:28:01.671 "dif_type": 0, 00:28:01.671 "assigned_rate_limits": { 00:28:01.671 "rw_ios_per_sec": 0, 00:28:01.671 "rw_mbytes_per_sec": 0, 00:28:01.671 "r_mbytes_per_sec": 0, 00:28:01.671 "w_mbytes_per_sec": 0 00:28:01.671 }, 00:28:01.671 "claimed": true, 00:28:01.671 "claim_type": "exclusive_write", 00:28:01.671 "zoned": false, 00:28:01.671 "supported_io_types": { 00:28:01.671 "read": true, 00:28:01.671 "write": true, 00:28:01.671 "unmap": true, 00:28:01.671 "flush": true, 00:28:01.671 "reset": true, 00:28:01.671 "nvme_admin": false, 00:28:01.672 "nvme_io": false, 00:28:01.672 "nvme_io_md": false, 00:28:01.672 "write_zeroes": true, 00:28:01.672 "zcopy": true, 00:28:01.672 "get_zone_info": false, 00:28:01.672 "zone_management": false, 00:28:01.672 "zone_append": false, 00:28:01.672 "compare": false, 00:28:01.672 "compare_and_write": false, 00:28:01.672 "abort": 
true, 00:28:01.672 "seek_hole": false, 00:28:01.672 "seek_data": false, 00:28:01.672 "copy": true, 00:28:01.672 "nvme_iov_md": false 00:28:01.672 }, 00:28:01.672 "memory_domains": [ 00:28:01.672 { 00:28:01.672 "dma_device_id": "system", 00:28:01.672 "dma_device_type": 1 00:28:01.672 }, 00:28:01.672 { 00:28:01.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:01.672 "dma_device_type": 2 00:28:01.672 } 00:28:01.672 ], 00:28:01.672 "driver_specific": {} 00:28:01.672 } 00:28:01.672 ] 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.672 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:01.930 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:01.931 "name": "Existed_Raid", 00:28:01.931 "uuid": "d6585a11-df99-4605-98a2-348987cef8df", 00:28:01.931 "strip_size_kb": 0, 00:28:01.931 "state": "configuring", 00:28:01.931 "raid_level": "raid1", 00:28:01.931 "superblock": true, 00:28:01.931 "num_base_bdevs": 2, 00:28:01.931 "num_base_bdevs_discovered": 1, 00:28:01.931 "num_base_bdevs_operational": 2, 00:28:01.931 "base_bdevs_list": [ 00:28:01.931 { 00:28:01.931 "name": "BaseBdev1", 00:28:01.931 "uuid": "22493f5f-7346-4188-84c0-68dbc9d4b261", 00:28:01.931 "is_configured": true, 00:28:01.931 "data_offset": 256, 00:28:01.931 "data_size": 7936 00:28:01.931 }, 00:28:01.931 { 00:28:01.931 "name": "BaseBdev2", 00:28:01.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:01.931 "is_configured": false, 00:28:01.931 "data_offset": 0, 00:28:01.931 "data_size": 0 00:28:01.931 } 00:28:01.931 ] 00:28:01.931 }' 00:28:01.931 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:01.931 07:33:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:02.865 07:33:35 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:03.124 [2024-07-25 07:33:35.480971] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:03.124 [2024-07-25 07:33:35.481008] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ac7790 name Existed_Raid, state configuring 00:28:03.124 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:03.382 [2024-07-25 07:33:35.709604] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:03.382 [2024-07-25 07:33:35.710989] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:03.382 [2024-07-25 07:33:35.711020] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:03.382 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:03.382 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:03.382 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:03.382 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.383 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:03.641 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.641 "name": "Existed_Raid", 00:28:03.641 "uuid": "4b667076-733a-426c-a146-c52453b364fc", 00:28:03.641 "strip_size_kb": 0, 00:28:03.641 "state": "configuring", 00:28:03.641 "raid_level": "raid1", 00:28:03.641 "superblock": true, 00:28:03.641 "num_base_bdevs": 2, 00:28:03.641 "num_base_bdevs_discovered": 1, 00:28:03.641 "num_base_bdevs_operational": 2, 00:28:03.641 "base_bdevs_list": [ 
00:28:03.641 { 00:28:03.641 "name": "BaseBdev1", 00:28:03.641 "uuid": "22493f5f-7346-4188-84c0-68dbc9d4b261", 00:28:03.641 "is_configured": true, 00:28:03.641 "data_offset": 256, 00:28:03.641 "data_size": 7936 00:28:03.641 }, 00:28:03.641 { 00:28:03.641 "name": "BaseBdev2", 00:28:03.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:03.641 "is_configured": false, 00:28:03.641 "data_offset": 0, 00:28:03.641 "data_size": 0 00:28:03.641 } 00:28:03.641 ] 00:28:03.641 }' 00:28:03.641 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.641 07:33:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:04.208 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:28:04.466 [2024-07-25 07:33:36.768116] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:04.466 [2024-07-25 07:33:36.768261] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ac6ed0 00:28:04.466 [2024-07-25 07:33:36.768274] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:04.466 [2024-07-25 07:33:36.768330] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac6910 00:28:04.466 [2024-07-25 07:33:36.768421] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ac6ed0 00:28:04.466 [2024-07-25 07:33:36.768430] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ac6ed0 00:28:04.466 [2024-07-25 07:33:36.768488] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:04.466 BaseBdev2 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:04.466 07:33:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:04.724 [ 00:28:04.724 { 00:28:04.724 "name": "BaseBdev2", 00:28:04.724 "aliases": [ 00:28:04.724 "f1dcbb50-33ef-4a92-a9c5-cf635934a37f" 00:28:04.724 ], 00:28:04.724 "product_name": "Malloc disk", 00:28:04.724 "block_size": 4096, 00:28:04.724 "num_blocks": 8192, 00:28:04.724 "uuid": "f1dcbb50-33ef-4a92-a9c5-cf635934a37f", 00:28:04.724 "md_size": 32, 00:28:04.724 "md_interleave": false, 00:28:04.724 "dif_type": 0, 00:28:04.724 "assigned_rate_limits": { 00:28:04.724 
"rw_ios_per_sec": 0, 00:28:04.724 "rw_mbytes_per_sec": 0, 00:28:04.724 "r_mbytes_per_sec": 0, 00:28:04.724 "w_mbytes_per_sec": 0 00:28:04.724 }, 00:28:04.724 "claimed": true, 00:28:04.724 "claim_type": "exclusive_write", 00:28:04.724 "zoned": false, 00:28:04.724 "supported_io_types": { 00:28:04.724 "read": true, 00:28:04.724 "write": true, 00:28:04.724 "unmap": true, 00:28:04.724 "flush": true, 00:28:04.724 "reset": true, 00:28:04.724 "nvme_admin": false, 00:28:04.724 "nvme_io": false, 00:28:04.724 "nvme_io_md": false, 00:28:04.724 "write_zeroes": true, 00:28:04.724 "zcopy": true, 00:28:04.724 "get_zone_info": false, 00:28:04.724 "zone_management": false, 00:28:04.724 "zone_append": false, 00:28:04.724 "compare": false, 00:28:04.724 "compare_and_write": false, 00:28:04.724 "abort": true, 00:28:04.724 "seek_hole": false, 00:28:04.724 "seek_data": false, 00:28:04.724 "copy": true, 00:28:04.724 "nvme_iov_md": false 00:28:04.724 }, 00:28:04.724 "memory_domains": [ 00:28:04.724 { 00:28:04.724 "dma_device_id": "system", 00:28:04.724 "dma_device_type": 1 00:28:04.724 }, 00:28:04.724 { 00:28:04.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.724 "dma_device_type": 2 00:28:04.724 } 00:28:04.724 ], 00:28:04.724 "driver_specific": {} 00:28:04.724 } 00:28:04.724 ] 00:28:04.724 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:28:04.724 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:04.724 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:04.724 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:04.724 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:04.724 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:04.724 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:04.725 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.983 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:04.983 "name": "Existed_Raid", 00:28:04.983 "uuid": "4b667076-733a-426c-a146-c52453b364fc", 00:28:04.983 "strip_size_kb": 0, 
00:28:04.983 "state": "online", 00:28:04.983 "raid_level": "raid1", 00:28:04.983 "superblock": true, 00:28:04.983 "num_base_bdevs": 2, 00:28:04.983 "num_base_bdevs_discovered": 2, 00:28:04.983 "num_base_bdevs_operational": 2, 00:28:04.983 "base_bdevs_list": [ 00:28:04.983 { 00:28:04.983 "name": "BaseBdev1", 00:28:04.983 "uuid": "22493f5f-7346-4188-84c0-68dbc9d4b261", 00:28:04.983 "is_configured": true, 00:28:04.983 "data_offset": 256, 00:28:04.983 "data_size": 7936 00:28:04.983 }, 00:28:04.983 { 00:28:04.983 "name": "BaseBdev2", 00:28:04.983 "uuid": "f1dcbb50-33ef-4a92-a9c5-cf635934a37f", 00:28:04.983 "is_configured": true, 00:28:04.983 "data_offset": 256, 00:28:04.983 "data_size": 7936 00:28:04.983 } 00:28:04.983 ] 00:28:04.983 }' 00:28:04.983 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:04.983 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:05.549 07:33:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:05.808 [2024-07-25 07:33:38.136009] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:05.808 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:05.808 "name": "Existed_Raid", 00:28:05.808 "aliases": [ 00:28:05.808 "4b667076-733a-426c-a146-c52453b364fc" 00:28:05.808 ], 00:28:05.808 "product_name": "Raid Volume", 00:28:05.808 "block_size": 4096, 00:28:05.808 "num_blocks": 7936, 00:28:05.808 "uuid": "4b667076-733a-426c-a146-c52453b364fc", 00:28:05.808 "md_size": 32, 00:28:05.808 "md_interleave": false, 00:28:05.808 "dif_type": 0, 00:28:05.808 "assigned_rate_limits": { 00:28:05.808 "rw_ios_per_sec": 0, 00:28:05.808 "rw_mbytes_per_sec": 0, 00:28:05.808 "r_mbytes_per_sec": 0, 00:28:05.808 "w_mbytes_per_sec": 0 00:28:05.808 }, 00:28:05.808 "claimed": false, 00:28:05.808 "zoned": false, 00:28:05.808 "supported_io_types": { 00:28:05.808 "read": true, 00:28:05.808 "write": true, 00:28:05.808 "unmap": false, 00:28:05.808 "flush": false, 00:28:05.808 "reset": true, 00:28:05.808 "nvme_admin": false, 00:28:05.808 "nvme_io": false, 00:28:05.808 "nvme_io_md": false, 00:28:05.808 "write_zeroes": true, 00:28:05.808 "zcopy": false, 00:28:05.808 "get_zone_info": false, 00:28:05.808 "zone_management": false, 00:28:05.808 "zone_append": false, 00:28:05.808 "compare": false, 00:28:05.808 "compare_and_write": false, 00:28:05.808 "abort": false, 00:28:05.808 "seek_hole": false, 
00:28:05.808 "seek_data": false, 00:28:05.808 "copy": false, 00:28:05.808 "nvme_iov_md": false 00:28:05.808 }, 00:28:05.808 "memory_domains": [ 00:28:05.808 { 00:28:05.808 "dma_device_id": "system", 00:28:05.808 "dma_device_type": 1 00:28:05.808 }, 00:28:05.808 { 00:28:05.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:05.808 "dma_device_type": 2 00:28:05.808 }, 00:28:05.808 { 00:28:05.808 "dma_device_id": "system", 00:28:05.808 "dma_device_type": 1 00:28:05.808 }, 00:28:05.808 { 00:28:05.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:05.808 "dma_device_type": 2 00:28:05.808 } 00:28:05.808 ], 00:28:05.808 "driver_specific": { 00:28:05.808 "raid": { 00:28:05.808 "uuid": "4b667076-733a-426c-a146-c52453b364fc", 00:28:05.808 "strip_size_kb": 0, 00:28:05.808 "state": "online", 00:28:05.808 "raid_level": "raid1", 00:28:05.808 "superblock": true, 00:28:05.808 "num_base_bdevs": 2, 00:28:05.808 "num_base_bdevs_discovered": 2, 00:28:05.808 "num_base_bdevs_operational": 2, 00:28:05.808 "base_bdevs_list": [ 00:28:05.808 { 00:28:05.808 "name": "BaseBdev1", 00:28:05.808 "uuid": "22493f5f-7346-4188-84c0-68dbc9d4b261", 00:28:05.808 "is_configured": true, 00:28:05.808 "data_offset": 256, 00:28:05.808 "data_size": 7936 00:28:05.808 }, 00:28:05.808 { 00:28:05.808 "name": "BaseBdev2", 00:28:05.808 "uuid": "f1dcbb50-33ef-4a92-a9c5-cf635934a37f", 00:28:05.808 "is_configured": true, 00:28:05.808 "data_offset": 256, 00:28:05.808 "data_size": 7936 00:28:05.808 } 00:28:05.808 ] 00:28:05.808 } 00:28:05.808 } 00:28:05.808 }' 00:28:05.808 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:05.808 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:05.808 BaseBdev2' 00:28:05.808 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:05.808 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:05.808 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:06.067 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:06.067 "name": "BaseBdev1", 00:28:06.067 "aliases": [ 00:28:06.067 "22493f5f-7346-4188-84c0-68dbc9d4b261" 00:28:06.067 ], 00:28:06.067 "product_name": "Malloc disk", 00:28:06.067 "block_size": 4096, 00:28:06.067 "num_blocks": 8192, 00:28:06.067 "uuid": "22493f5f-7346-4188-84c0-68dbc9d4b261", 00:28:06.067 "md_size": 32, 00:28:06.067 "md_interleave": false, 00:28:06.067 "dif_type": 0, 00:28:06.067 "assigned_rate_limits": { 00:28:06.067 "rw_ios_per_sec": 0, 00:28:06.067 "rw_mbytes_per_sec": 0, 00:28:06.067 "r_mbytes_per_sec": 0, 00:28:06.067 "w_mbytes_per_sec": 0 00:28:06.067 }, 00:28:06.067 "claimed": true, 00:28:06.067 "claim_type": "exclusive_write", 00:28:06.067 "zoned": false, 00:28:06.067 "supported_io_types": { 00:28:06.067 "read": true, 00:28:06.067 "write": true, 00:28:06.067 "unmap": true, 00:28:06.067 "flush": true, 00:28:06.067 "reset": true, 00:28:06.067 "nvme_admin": false, 00:28:06.067 "nvme_io": false, 00:28:06.067 "nvme_io_md": false, 00:28:06.067 "write_zeroes": true, 00:28:06.067 "zcopy": true, 00:28:06.067 "get_zone_info": false, 00:28:06.067 
"zone_management": false, 00:28:06.067 "zone_append": false, 00:28:06.067 "compare": false, 00:28:06.067 "compare_and_write": false, 00:28:06.067 "abort": true, 00:28:06.067 "seek_hole": false, 00:28:06.067 "seek_data": false, 00:28:06.067 "copy": true, 00:28:06.067 "nvme_iov_md": false 00:28:06.067 }, 00:28:06.067 "memory_domains": [ 00:28:06.067 { 00:28:06.067 "dma_device_id": "system", 00:28:06.067 "dma_device_type": 1 00:28:06.067 }, 00:28:06.067 { 00:28:06.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:06.067 "dma_device_type": 2 00:28:06.067 } 00:28:06.067 ], 00:28:06.067 "driver_specific": {} 00:28:06.067 }' 00:28:06.067 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:06.067 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:06.067 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:06.067 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:06.067 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:06.326 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:06.584 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:06.584 "name": "BaseBdev2", 00:28:06.584 "aliases": [ 00:28:06.584 "f1dcbb50-33ef-4a92-a9c5-cf635934a37f" 00:28:06.584 ], 00:28:06.584 "product_name": "Malloc disk", 00:28:06.584 "block_size": 4096, 00:28:06.584 "num_blocks": 8192, 00:28:06.584 "uuid": "f1dcbb50-33ef-4a92-a9c5-cf635934a37f", 00:28:06.584 "md_size": 32, 00:28:06.584 "md_interleave": false, 00:28:06.584 "dif_type": 0, 00:28:06.584 "assigned_rate_limits": { 00:28:06.584 "rw_ios_per_sec": 0, 00:28:06.584 "rw_mbytes_per_sec": 0, 00:28:06.584 "r_mbytes_per_sec": 0, 00:28:06.584 "w_mbytes_per_sec": 0 00:28:06.584 }, 00:28:06.584 "claimed": true, 00:28:06.584 "claim_type": "exclusive_write", 00:28:06.584 "zoned": false, 00:28:06.584 "supported_io_types": { 00:28:06.584 "read": true, 00:28:06.584 "write": true, 00:28:06.584 "unmap": true, 00:28:06.584 "flush": true, 00:28:06.584 "reset": true, 00:28:06.584 "nvme_admin": false, 00:28:06.584 "nvme_io": false, 
00:28:06.584 "nvme_io_md": false, 00:28:06.584 "write_zeroes": true, 00:28:06.584 "zcopy": true, 00:28:06.584 "get_zone_info": false, 00:28:06.584 "zone_management": false, 00:28:06.584 "zone_append": false, 00:28:06.584 "compare": false, 00:28:06.584 "compare_and_write": false, 00:28:06.584 "abort": true, 00:28:06.584 "seek_hole": false, 00:28:06.584 "seek_data": false, 00:28:06.584 "copy": true, 00:28:06.584 "nvme_iov_md": false 00:28:06.584 }, 00:28:06.584 "memory_domains": [ 00:28:06.584 { 00:28:06.584 "dma_device_id": "system", 00:28:06.584 "dma_device_type": 1 00:28:06.584 }, 00:28:06.584 { 00:28:06.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:06.584 "dma_device_type": 2 00:28:06.584 } 00:28:06.584 ], 00:28:06.584 "driver_specific": {} 00:28:06.585 }' 00:28:06.585 07:33:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:06.585 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:06.585 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:06.585 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:06.843 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:07.101 [2024-07-25 07:33:39.551625] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:07.102 07:33:39 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.102 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:07.360 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.360 "name": "Existed_Raid", 00:28:07.360 "uuid": "4b667076-733a-426c-a146-c52453b364fc", 00:28:07.360 "strip_size_kb": 0, 00:28:07.360 "state": "online", 00:28:07.360 "raid_level": "raid1", 00:28:07.360 "superblock": true, 00:28:07.360 "num_base_bdevs": 2, 00:28:07.360 "num_base_bdevs_discovered": 1, 00:28:07.360 "num_base_bdevs_operational": 1, 00:28:07.360 "base_bdevs_list": [ 00:28:07.360 { 00:28:07.360 "name": null, 00:28:07.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.360 "is_configured": false, 00:28:07.360 "data_offset": 256, 00:28:07.360 "data_size": 7936 00:28:07.360 }, 00:28:07.360 { 00:28:07.360 "name": "BaseBdev2", 00:28:07.360 "uuid": "f1dcbb50-33ef-4a92-a9c5-cf635934a37f", 00:28:07.360 "is_configured": true, 00:28:07.360 "data_offset": 256, 00:28:07.360 "data_size": 7936 00:28:07.360 } 00:28:07.360 ] 00:28:07.360 }' 00:28:07.360 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.360 07:33:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:07.927 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:07.927 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:07.927 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.927 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:08.186 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:08.186 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:08.186 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:08.445 [2024-07-25 07:33:40.833158] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:08.445 [2024-07-25 07:33:40.833238] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:08.445 [2024-07-25 07:33:40.844254] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:08.445 [2024-07-25 07:33:40.844285] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:08.445 [2024-07-25 07:33:40.844295] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ac6ed0 name Existed_Raid, state offline 00:28:08.445 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:08.445 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:08.445 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.445 07:33:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1763372 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1763372 ']' 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1763372 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1763372 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1763372' 00:28:08.704 killing process with pid 1763372 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1763372 00:28:08.704 [2024-07-25 07:33:41.150712] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:08.704 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 1763372 00:28:08.704 [2024-07-25 07:33:41.151565] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:08.963 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:08.963 00:28:08.963 real 0m10.074s 00:28:08.963 user 0m17.856s 00:28:08.963 sys 0m1.923s 00:28:08.963 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:28:08.963 07:33:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:08.963 ************************************ 00:28:08.963 END TEST raid_state_function_test_sb_md_separate 00:28:08.963 ************************************ 00:28:08.963 07:33:41 bdev_raid -- bdev/bdev_raid.sh@986 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:08.963 07:33:41 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:28:08.963 07:33:41 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:08.963 07:33:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:08.963 ************************************ 00:28:08.963 START TEST raid_superblock_test_md_separate 00:28:08.963 ************************************ 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@414 -- # local strip_size 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@427 -- # raid_pid=1765278 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@428 -- # waitforlisten 1765278 /var/tmp/spdk-raid.sock 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1765278 ']' 00:28:08.963 07:33:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:08.964 07:33:41 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@836 -- # local max_retries=100 00:28:08.964 07:33:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:08.964 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:08.964 07:33:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:08.964 07:33:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:08.964 [2024-07-25 07:33:41.487671] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:28:08.964 [2024-07-25 07:33:41.487726] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1765278 ] 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.0 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.1 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.2 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.3 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.4 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.5 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.6 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:01.7 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.0 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.1 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.2 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.3 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.4 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.5 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.6 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3d:02.7 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3f:01.0 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3f:01.1 cannot be used 
00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.222 EAL: Requested device 0000:3f:01.2 cannot be used 00:28:09.222 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:01.3 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:01.4 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:01.5 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:01.6 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:01.7 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.0 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.1 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.2 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.3 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.4 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.5 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.6 cannot be used 00:28:09.223 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:09.223 EAL: Requested device 0000:3f:02.7 cannot be used 00:28:09.223 [2024-07-25 07:33:41.617517] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:09.223 [2024-07-25 07:33:41.703032] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:28:09.481 [2024-07-25 07:33:41.760835] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:09.481 [2024-07-25 07:33:41.760865] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@864 -- # return 0 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:28:10.048 07:33:42 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:10.048 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:10.307 malloc1 00:28:10.307 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:10.307 [2024-07-25 07:33:42.806234] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:10.307 [2024-07-25 07:33:42.806277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:10.307 [2024-07-25 07:33:42.806297] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd5bf0 00:28:10.307 [2024-07-25 07:33:42.806309] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:10.308 [2024-07-25 07:33:42.807691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:10.308 [2024-07-25 07:33:42.807718] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:10.308 pt1 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:10.308 07:33:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:10.566 malloc2 00:28:10.566 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:10.825 [2024-07-25 07:33:43.256497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:10.825 [2024-07-25 07:33:43.256538] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:10.825 [2024-07-25 07:33:43.256555] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e2ca60 00:28:10.825 [2024-07-25 07:33:43.256566] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:10.825 [2024-07-25 07:33:43.257755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:10.825 [2024-07-25 07:33:43.257780] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:28:10.825 pt2 00:28:10.825 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:28:10.825 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:10.825 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:11.082 [2024-07-25 07:33:43.485102] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:11.082 [2024-07-25 07:33:43.486262] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:11.082 [2024-07-25 07:33:43.486407] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e2e0a0 00:28:11.082 [2024-07-25 07:33:43.486420] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:11.082 [2024-07-25 07:33:43.486478] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd5850 00:28:11.082 [2024-07-25 07:33:43.486581] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e2e0a0 00:28:11.082 [2024-07-25 07:33:43.486589] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e2e0a0 00:28:11.082 [2024-07-25 07:33:43.486651] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.082 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.340 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.340 "name": "raid_bdev1", 00:28:11.340 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:11.340 "strip_size_kb": 0, 00:28:11.340 "state": "online", 00:28:11.340 "raid_level": "raid1", 00:28:11.340 "superblock": true, 00:28:11.340 "num_base_bdevs": 2, 00:28:11.340 "num_base_bdevs_discovered": 2, 00:28:11.340 "num_base_bdevs_operational": 2, 
00:28:11.340 "base_bdevs_list": [ 00:28:11.340 { 00:28:11.340 "name": "pt1", 00:28:11.340 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:11.340 "is_configured": true, 00:28:11.340 "data_offset": 256, 00:28:11.340 "data_size": 7936 00:28:11.340 }, 00:28:11.340 { 00:28:11.340 "name": "pt2", 00:28:11.340 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:11.340 "is_configured": true, 00:28:11.340 "data_offset": 256, 00:28:11.340 "data_size": 7936 00:28:11.340 } 00:28:11.340 ] 00:28:11.340 }' 00:28:11.340 07:33:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.340 07:33:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:11.913 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:12.172 [2024-07-25 07:33:44.516023] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:12.172 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:12.172 "name": "raid_bdev1", 00:28:12.172 "aliases": [ 00:28:12.172 "c1452626-a750-4e6d-b2c3-44a2e5eeca5c" 00:28:12.172 ], 00:28:12.172 "product_name": "Raid Volume", 00:28:12.172 "block_size": 4096, 00:28:12.172 "num_blocks": 7936, 00:28:12.172 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:12.172 "md_size": 32, 00:28:12.172 "md_interleave": false, 00:28:12.172 "dif_type": 0, 00:28:12.172 "assigned_rate_limits": { 00:28:12.172 "rw_ios_per_sec": 0, 00:28:12.172 "rw_mbytes_per_sec": 0, 00:28:12.172 "r_mbytes_per_sec": 0, 00:28:12.172 "w_mbytes_per_sec": 0 00:28:12.172 }, 00:28:12.172 "claimed": false, 00:28:12.172 "zoned": false, 00:28:12.172 "supported_io_types": { 00:28:12.172 "read": true, 00:28:12.172 "write": true, 00:28:12.172 "unmap": false, 00:28:12.172 "flush": false, 00:28:12.172 "reset": true, 00:28:12.172 "nvme_admin": false, 00:28:12.172 "nvme_io": false, 00:28:12.172 "nvme_io_md": false, 00:28:12.172 "write_zeroes": true, 00:28:12.172 "zcopy": false, 00:28:12.172 "get_zone_info": false, 00:28:12.172 "zone_management": false, 00:28:12.172 "zone_append": false, 00:28:12.172 "compare": false, 00:28:12.172 "compare_and_write": false, 00:28:12.172 "abort": false, 00:28:12.172 "seek_hole": false, 00:28:12.172 "seek_data": false, 00:28:12.172 "copy": false, 00:28:12.172 "nvme_iov_md": false 00:28:12.172 }, 00:28:12.172 "memory_domains": [ 00:28:12.172 { 00:28:12.172 "dma_device_id": "system", 00:28:12.172 "dma_device_type": 1 00:28:12.172 }, 00:28:12.172 { 00:28:12.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:28:12.172 "dma_device_type": 2 00:28:12.172 }, 00:28:12.172 { 00:28:12.172 "dma_device_id": "system", 00:28:12.172 "dma_device_type": 1 00:28:12.172 }, 00:28:12.172 { 00:28:12.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:12.172 "dma_device_type": 2 00:28:12.172 } 00:28:12.172 ], 00:28:12.172 "driver_specific": { 00:28:12.172 "raid": { 00:28:12.172 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:12.172 "strip_size_kb": 0, 00:28:12.172 "state": "online", 00:28:12.172 "raid_level": "raid1", 00:28:12.172 "superblock": true, 00:28:12.172 "num_base_bdevs": 2, 00:28:12.172 "num_base_bdevs_discovered": 2, 00:28:12.172 "num_base_bdevs_operational": 2, 00:28:12.172 "base_bdevs_list": [ 00:28:12.172 { 00:28:12.172 "name": "pt1", 00:28:12.172 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:12.172 "is_configured": true, 00:28:12.172 "data_offset": 256, 00:28:12.172 "data_size": 7936 00:28:12.172 }, 00:28:12.172 { 00:28:12.172 "name": "pt2", 00:28:12.172 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:12.172 "is_configured": true, 00:28:12.172 "data_offset": 256, 00:28:12.172 "data_size": 7936 00:28:12.172 } 00:28:12.172 ] 00:28:12.172 } 00:28:12.172 } 00:28:12.172 }' 00:28:12.172 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:12.172 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:12.172 pt2' 00:28:12.172 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:12.173 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:12.173 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:12.431 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:12.431 "name": "pt1", 00:28:12.431 "aliases": [ 00:28:12.431 "00000000-0000-0000-0000-000000000001" 00:28:12.431 ], 00:28:12.431 "product_name": "passthru", 00:28:12.431 "block_size": 4096, 00:28:12.431 "num_blocks": 8192, 00:28:12.431 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:12.431 "md_size": 32, 00:28:12.431 "md_interleave": false, 00:28:12.431 "dif_type": 0, 00:28:12.431 "assigned_rate_limits": { 00:28:12.431 "rw_ios_per_sec": 0, 00:28:12.431 "rw_mbytes_per_sec": 0, 00:28:12.431 "r_mbytes_per_sec": 0, 00:28:12.431 "w_mbytes_per_sec": 0 00:28:12.431 }, 00:28:12.431 "claimed": true, 00:28:12.431 "claim_type": "exclusive_write", 00:28:12.431 "zoned": false, 00:28:12.431 "supported_io_types": { 00:28:12.431 "read": true, 00:28:12.431 "write": true, 00:28:12.431 "unmap": true, 00:28:12.431 "flush": true, 00:28:12.431 "reset": true, 00:28:12.431 "nvme_admin": false, 00:28:12.431 "nvme_io": false, 00:28:12.431 "nvme_io_md": false, 00:28:12.431 "write_zeroes": true, 00:28:12.431 "zcopy": true, 00:28:12.431 "get_zone_info": false, 00:28:12.431 "zone_management": false, 00:28:12.431 "zone_append": false, 00:28:12.431 "compare": false, 00:28:12.431 "compare_and_write": false, 00:28:12.431 "abort": true, 00:28:12.431 "seek_hole": false, 00:28:12.431 "seek_data": false, 00:28:12.431 "copy": true, 00:28:12.431 "nvme_iov_md": false 00:28:12.431 }, 00:28:12.431 "memory_domains": [ 00:28:12.431 { 00:28:12.431 "dma_device_id": "system", 00:28:12.431 
"dma_device_type": 1 00:28:12.431 }, 00:28:12.431 { 00:28:12.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:12.431 "dma_device_type": 2 00:28:12.431 } 00:28:12.431 ], 00:28:12.431 "driver_specific": { 00:28:12.431 "passthru": { 00:28:12.431 "name": "pt1", 00:28:12.431 "base_bdev_name": "malloc1" 00:28:12.431 } 00:28:12.431 } 00:28:12.431 }' 00:28:12.431 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:12.431 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:12.431 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:12.431 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:12.431 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:12.689 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:12.689 07:33:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:12.689 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:12.946 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:12.946 "name": "pt2", 00:28:12.946 "aliases": [ 00:28:12.946 "00000000-0000-0000-0000-000000000002" 00:28:12.946 ], 00:28:12.946 "product_name": "passthru", 00:28:12.946 "block_size": 4096, 00:28:12.946 "num_blocks": 8192, 00:28:12.946 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:12.946 "md_size": 32, 00:28:12.947 "md_interleave": false, 00:28:12.947 "dif_type": 0, 00:28:12.947 "assigned_rate_limits": { 00:28:12.947 "rw_ios_per_sec": 0, 00:28:12.947 "rw_mbytes_per_sec": 0, 00:28:12.947 "r_mbytes_per_sec": 0, 00:28:12.947 "w_mbytes_per_sec": 0 00:28:12.947 }, 00:28:12.947 "claimed": true, 00:28:12.947 "claim_type": "exclusive_write", 00:28:12.947 "zoned": false, 00:28:12.947 "supported_io_types": { 00:28:12.947 "read": true, 00:28:12.947 "write": true, 00:28:12.947 "unmap": true, 00:28:12.947 "flush": true, 00:28:12.947 "reset": true, 00:28:12.947 "nvme_admin": false, 00:28:12.947 "nvme_io": false, 00:28:12.947 "nvme_io_md": false, 00:28:12.947 "write_zeroes": true, 00:28:12.947 "zcopy": true, 00:28:12.947 "get_zone_info": false, 00:28:12.947 "zone_management": false, 00:28:12.947 "zone_append": false, 00:28:12.947 "compare": false, 00:28:12.947 "compare_and_write": false, 00:28:12.947 "abort": true, 00:28:12.947 "seek_hole": false, 00:28:12.947 "seek_data": false, 00:28:12.947 "copy": true, 
00:28:12.947 "nvme_iov_md": false 00:28:12.947 }, 00:28:12.947 "memory_domains": [ 00:28:12.947 { 00:28:12.947 "dma_device_id": "system", 00:28:12.947 "dma_device_type": 1 00:28:12.947 }, 00:28:12.947 { 00:28:12.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:12.947 "dma_device_type": 2 00:28:12.947 } 00:28:12.947 ], 00:28:12.947 "driver_specific": { 00:28:12.947 "passthru": { 00:28:12.947 "name": "pt2", 00:28:12.947 "base_bdev_name": "malloc2" 00:28:12.947 } 00:28:12.947 } 00:28:12.947 }' 00:28:12.947 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:12.947 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:12.947 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:12.947 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:13.204 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:28:13.463 [2024-07-25 07:33:45.915707] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:13.463 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=c1452626-a750-4e6d-b2c3-44a2e5eeca5c 00:28:13.463 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # '[' -z c1452626-a750-4e6d-b2c3-44a2e5eeca5c ']' 00:28:13.463 07:33:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:13.721 [2024-07-25 07:33:46.144073] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:13.721 [2024-07-25 07:33:46.144092] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:13.721 [2024-07-25 07:33:46.144148] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:13.721 [2024-07-25 07:33:46.144197] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:13.721 [2024-07-25 07:33:46.144209] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e2e0a0 name raid_bdev1, state offline 00:28:13.721 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.721 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:28:13.979 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:28:13.979 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:28:13.979 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:28:13.979 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:14.238 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:28:14.238 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:14.497 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:14.497 07:33:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:14.755 07:33:47 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:14.755 [2024-07-25 07:33:47.287066] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:14.755 [2024-07-25 07:33:47.288324] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:14.755 [2024-07-25 07:33:47.288374] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:14.755 [2024-07-25 07:33:47.288413] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:14.755 [2024-07-25 07:33:47.288432] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:14.755 [2024-07-25 07:33:47.288441] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e2cc90 name raid_bdev1, state configuring 00:28:15.014 request: 00:28:15.014 { 00:28:15.014 "name": "raid_bdev1", 00:28:15.014 "raid_level": "raid1", 00:28:15.014 "base_bdevs": [ 00:28:15.014 "malloc1", 00:28:15.014 "malloc2" 00:28:15.014 ], 00:28:15.014 "superblock": false, 00:28:15.014 "method": "bdev_raid_create", 00:28:15.014 "req_id": 1 00:28:15.014 } 00:28:15.014 Got JSON-RPC error response 00:28:15.014 response: 00:28:15.014 { 00:28:15.014 "code": -17, 00:28:15.014 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:15.014 } 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # es=1 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:28:15.014 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:15.273 [2024-07-25 07:33:47.744221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:15.273 [2024-07-25 07:33:47.744261] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:15.273 [2024-07-25 07:33:47.744279] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd48a0 00:28:15.273 [2024-07-25 07:33:47.744290] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:15.273 [2024-07-25 07:33:47.745618] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:15.273 [2024-07-25 07:33:47.745644] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:15.273 [2024-07-25 
07:33:47.745684] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:15.273 [2024-07-25 07:33:47.745713] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:15.273 pt1 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.273 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.531 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.531 "name": "raid_bdev1", 00:28:15.531 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:15.531 "strip_size_kb": 0, 00:28:15.531 "state": "configuring", 00:28:15.531 "raid_level": "raid1", 00:28:15.531 "superblock": true, 00:28:15.531 "num_base_bdevs": 2, 00:28:15.531 "num_base_bdevs_discovered": 1, 00:28:15.531 "num_base_bdevs_operational": 2, 00:28:15.531 "base_bdevs_list": [ 00:28:15.531 { 00:28:15.531 "name": "pt1", 00:28:15.531 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:15.531 "is_configured": true, 00:28:15.531 "data_offset": 256, 00:28:15.531 "data_size": 7936 00:28:15.531 }, 00:28:15.531 { 00:28:15.531 "name": null, 00:28:15.531 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:15.531 "is_configured": false, 00:28:15.531 "data_offset": 256, 00:28:15.531 "data_size": 7936 00:28:15.531 } 00:28:15.531 ] 00:28:15.531 }' 00:28:15.531 07:33:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.531 07:33:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:16.097 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:28:16.097 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:28:16.097 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:28:16.097 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:16.354 [2024-07-25 07:33:48.767049] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:16.354 [2024-07-25 07:33:48.767092] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:16.354 [2024-07-25 07:33:48.767109] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd3fc0 00:28:16.354 [2024-07-25 07:33:48.767120] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:16.354 [2024-07-25 07:33:48.767293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:16.354 [2024-07-25 07:33:48.767309] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:16.354 [2024-07-25 07:33:48.767348] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:16.354 [2024-07-25 07:33:48.767365] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:16.354 [2024-07-25 07:33:48.767447] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e0ea00 00:28:16.354 [2024-07-25 07:33:48.767457] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:16.354 [2024-07-25 07:33:48.767515] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd5850 00:28:16.354 [2024-07-25 07:33:48.767610] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e0ea00 00:28:16.354 [2024-07-25 07:33:48.767619] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e0ea00 00:28:16.354 [2024-07-25 07:33:48.767683] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:16.354 pt2 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.354 07:33:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.611 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:16.611 "name": "raid_bdev1", 00:28:16.611 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:16.611 "strip_size_kb": 0, 00:28:16.611 "state": "online", 00:28:16.611 "raid_level": "raid1", 00:28:16.611 "superblock": true, 00:28:16.611 "num_base_bdevs": 2, 00:28:16.611 "num_base_bdevs_discovered": 2, 00:28:16.611 "num_base_bdevs_operational": 2, 00:28:16.611 "base_bdevs_list": [ 00:28:16.611 { 00:28:16.611 "name": "pt1", 00:28:16.611 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:16.611 "is_configured": true, 00:28:16.611 "data_offset": 256, 00:28:16.611 "data_size": 7936 00:28:16.611 }, 00:28:16.611 { 00:28:16.611 "name": "pt2", 00:28:16.611 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:16.611 "is_configured": true, 00:28:16.611 "data_offset": 256, 00:28:16.611 "data_size": 7936 00:28:16.611 } 00:28:16.611 ] 00:28:16.611 }' 00:28:16.611 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:16.611 07:33:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:17.176 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:17.435 [2024-07-25 07:33:49.733963] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:17.435 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:17.435 "name": "raid_bdev1", 00:28:17.435 "aliases": [ 00:28:17.435 "c1452626-a750-4e6d-b2c3-44a2e5eeca5c" 00:28:17.435 ], 00:28:17.435 "product_name": "Raid Volume", 00:28:17.435 "block_size": 4096, 00:28:17.435 "num_blocks": 7936, 00:28:17.435 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:17.435 "md_size": 32, 00:28:17.435 "md_interleave": false, 00:28:17.435 "dif_type": 0, 00:28:17.435 "assigned_rate_limits": { 00:28:17.435 "rw_ios_per_sec": 0, 00:28:17.435 "rw_mbytes_per_sec": 0, 00:28:17.435 "r_mbytes_per_sec": 0, 00:28:17.435 "w_mbytes_per_sec": 0 00:28:17.435 }, 00:28:17.435 "claimed": false, 00:28:17.435 "zoned": false, 00:28:17.435 "supported_io_types": { 00:28:17.435 "read": true, 00:28:17.435 "write": true, 00:28:17.435 "unmap": false, 00:28:17.435 "flush": false, 00:28:17.435 "reset": true, 00:28:17.435 "nvme_admin": false, 00:28:17.435 "nvme_io": false, 00:28:17.435 "nvme_io_md": false, 00:28:17.435 "write_zeroes": true, 00:28:17.435 
"zcopy": false, 00:28:17.435 "get_zone_info": false, 00:28:17.435 "zone_management": false, 00:28:17.435 "zone_append": false, 00:28:17.435 "compare": false, 00:28:17.435 "compare_and_write": false, 00:28:17.435 "abort": false, 00:28:17.435 "seek_hole": false, 00:28:17.435 "seek_data": false, 00:28:17.435 "copy": false, 00:28:17.435 "nvme_iov_md": false 00:28:17.435 }, 00:28:17.435 "memory_domains": [ 00:28:17.435 { 00:28:17.435 "dma_device_id": "system", 00:28:17.435 "dma_device_type": 1 00:28:17.435 }, 00:28:17.435 { 00:28:17.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:17.435 "dma_device_type": 2 00:28:17.435 }, 00:28:17.435 { 00:28:17.435 "dma_device_id": "system", 00:28:17.435 "dma_device_type": 1 00:28:17.435 }, 00:28:17.435 { 00:28:17.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:17.435 "dma_device_type": 2 00:28:17.435 } 00:28:17.435 ], 00:28:17.435 "driver_specific": { 00:28:17.435 "raid": { 00:28:17.435 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:17.435 "strip_size_kb": 0, 00:28:17.435 "state": "online", 00:28:17.435 "raid_level": "raid1", 00:28:17.435 "superblock": true, 00:28:17.435 "num_base_bdevs": 2, 00:28:17.435 "num_base_bdevs_discovered": 2, 00:28:17.435 "num_base_bdevs_operational": 2, 00:28:17.435 "base_bdevs_list": [ 00:28:17.435 { 00:28:17.435 "name": "pt1", 00:28:17.435 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:17.435 "is_configured": true, 00:28:17.435 "data_offset": 256, 00:28:17.435 "data_size": 7936 00:28:17.435 }, 00:28:17.435 { 00:28:17.435 "name": "pt2", 00:28:17.435 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:17.435 "is_configured": true, 00:28:17.435 "data_offset": 256, 00:28:17.435 "data_size": 7936 00:28:17.435 } 00:28:17.435 ] 00:28:17.435 } 00:28:17.435 } 00:28:17.435 }' 00:28:17.435 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:17.435 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:17.435 pt2' 00:28:17.435 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:17.435 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:17.435 07:33:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:17.693 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:17.693 "name": "pt1", 00:28:17.693 "aliases": [ 00:28:17.693 "00000000-0000-0000-0000-000000000001" 00:28:17.693 ], 00:28:17.693 "product_name": "passthru", 00:28:17.693 "block_size": 4096, 00:28:17.693 "num_blocks": 8192, 00:28:17.693 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:17.693 "md_size": 32, 00:28:17.693 "md_interleave": false, 00:28:17.693 "dif_type": 0, 00:28:17.693 "assigned_rate_limits": { 00:28:17.693 "rw_ios_per_sec": 0, 00:28:17.693 "rw_mbytes_per_sec": 0, 00:28:17.693 "r_mbytes_per_sec": 0, 00:28:17.693 "w_mbytes_per_sec": 0 00:28:17.693 }, 00:28:17.693 "claimed": true, 00:28:17.693 "claim_type": "exclusive_write", 00:28:17.693 "zoned": false, 00:28:17.693 "supported_io_types": { 00:28:17.693 "read": true, 00:28:17.693 "write": true, 00:28:17.693 "unmap": true, 00:28:17.693 "flush": true, 00:28:17.693 "reset": true, 00:28:17.693 "nvme_admin": false, 00:28:17.693 
"nvme_io": false, 00:28:17.693 "nvme_io_md": false, 00:28:17.693 "write_zeroes": true, 00:28:17.693 "zcopy": true, 00:28:17.693 "get_zone_info": false, 00:28:17.693 "zone_management": false, 00:28:17.693 "zone_append": false, 00:28:17.693 "compare": false, 00:28:17.693 "compare_and_write": false, 00:28:17.693 "abort": true, 00:28:17.693 "seek_hole": false, 00:28:17.693 "seek_data": false, 00:28:17.693 "copy": true, 00:28:17.693 "nvme_iov_md": false 00:28:17.693 }, 00:28:17.694 "memory_domains": [ 00:28:17.694 { 00:28:17.694 "dma_device_id": "system", 00:28:17.694 "dma_device_type": 1 00:28:17.694 }, 00:28:17.694 { 00:28:17.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:17.694 "dma_device_type": 2 00:28:17.694 } 00:28:17.694 ], 00:28:17.694 "driver_specific": { 00:28:17.694 "passthru": { 00:28:17.694 "name": "pt1", 00:28:17.694 "base_bdev_name": "malloc1" 00:28:17.694 } 00:28:17.694 } 00:28:17.694 }' 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:17.694 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:17.979 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:17.979 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:17.979 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:17.979 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:17.979 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:17.979 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:17.979 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:18.237 "name": "pt2", 00:28:18.237 "aliases": [ 00:28:18.237 "00000000-0000-0000-0000-000000000002" 00:28:18.237 ], 00:28:18.237 "product_name": "passthru", 00:28:18.237 "block_size": 4096, 00:28:18.237 "num_blocks": 8192, 00:28:18.237 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:18.237 "md_size": 32, 00:28:18.237 "md_interleave": false, 00:28:18.237 "dif_type": 0, 00:28:18.237 "assigned_rate_limits": { 00:28:18.237 "rw_ios_per_sec": 0, 00:28:18.237 "rw_mbytes_per_sec": 0, 00:28:18.237 "r_mbytes_per_sec": 0, 00:28:18.237 "w_mbytes_per_sec": 0 00:28:18.237 }, 00:28:18.237 "claimed": true, 00:28:18.237 "claim_type": "exclusive_write", 00:28:18.237 "zoned": false, 00:28:18.237 "supported_io_types": { 00:28:18.237 "read": true, 00:28:18.237 
"write": true, 00:28:18.237 "unmap": true, 00:28:18.237 "flush": true, 00:28:18.237 "reset": true, 00:28:18.237 "nvme_admin": false, 00:28:18.237 "nvme_io": false, 00:28:18.237 "nvme_io_md": false, 00:28:18.237 "write_zeroes": true, 00:28:18.237 "zcopy": true, 00:28:18.237 "get_zone_info": false, 00:28:18.237 "zone_management": false, 00:28:18.237 "zone_append": false, 00:28:18.237 "compare": false, 00:28:18.237 "compare_and_write": false, 00:28:18.237 "abort": true, 00:28:18.237 "seek_hole": false, 00:28:18.237 "seek_data": false, 00:28:18.237 "copy": true, 00:28:18.237 "nvme_iov_md": false 00:28:18.237 }, 00:28:18.237 "memory_domains": [ 00:28:18.237 { 00:28:18.237 "dma_device_id": "system", 00:28:18.237 "dma_device_type": 1 00:28:18.237 }, 00:28:18.237 { 00:28:18.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:18.237 "dma_device_type": 2 00:28:18.237 } 00:28:18.237 ], 00:28:18.237 "driver_specific": { 00:28:18.237 "passthru": { 00:28:18.237 "name": "pt2", 00:28:18.237 "base_bdev_name": "malloc2" 00:28:18.237 } 00:28:18.237 } 00:28:18.237 }' 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:18.237 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:18.496 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:18.496 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:18.496 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:18.496 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:18.496 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:18.496 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:18.496 07:33:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:28:18.755 [2024-07-25 07:33:51.113587] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:18.755 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # '[' c1452626-a750-4e6d-b2c3-44a2e5eeca5c '!=' c1452626-a750-4e6d-b2c3-44a2e5eeca5c ']' 00:28:18.755 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:28:18.755 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:18.755 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:18.755 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete 
pt1 00:28:19.013 [2024-07-25 07:33:51.341980] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.014 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.271 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.271 "name": "raid_bdev1", 00:28:19.271 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:19.271 "strip_size_kb": 0, 00:28:19.271 "state": "online", 00:28:19.271 "raid_level": "raid1", 00:28:19.271 "superblock": true, 00:28:19.271 "num_base_bdevs": 2, 00:28:19.271 "num_base_bdevs_discovered": 1, 00:28:19.271 "num_base_bdevs_operational": 1, 00:28:19.271 "base_bdevs_list": [ 00:28:19.271 { 00:28:19.271 "name": null, 00:28:19.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:19.271 "is_configured": false, 00:28:19.271 "data_offset": 256, 00:28:19.271 "data_size": 7936 00:28:19.271 }, 00:28:19.271 { 00:28:19.271 "name": "pt2", 00:28:19.271 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:19.271 "is_configured": true, 00:28:19.271 "data_offset": 256, 00:28:19.271 "data_size": 7936 00:28:19.271 } 00:28:19.271 ] 00:28:19.271 }' 00:28:19.272 07:33:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.272 07:33:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:19.838 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:19.838 [2024-07-25 07:33:52.364657] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:19.838 [2024-07-25 07:33:52.364679] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:19.838 [2024-07-25 07:33:52.364725] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:19.838 [2024-07-25 07:33:52.364770] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:28:19.838 [2024-07-25 07:33:52.364782] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e0ea00 name raid_bdev1, state offline 00:28:20.096 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.096 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:28:20.096 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:28:20.096 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:28:20.096 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:28:20.096 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:28:20.096 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:20.354 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:28:20.354 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:28:20.354 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:28:20.354 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:28:20.354 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@534 -- # i=1 00:28:20.354 07:33:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:20.613 [2024-07-25 07:33:53.038397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:20.613 [2024-07-25 07:33:53.038435] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:20.613 [2024-07-25 07:33:53.038452] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd5e20 00:28:20.613 [2024-07-25 07:33:53.038463] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:20.613 [2024-07-25 07:33:53.039799] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:20.613 [2024-07-25 07:33:53.039825] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:20.613 [2024-07-25 07:33:53.039866] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:20.613 [2024-07-25 07:33:53.039892] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:20.613 [2024-07-25 07:33:53.039962] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e0ef50 00:28:20.613 [2024-07-25 07:33:53.039972] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:20.613 [2024-07-25 07:33:53.040024] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd4f50 00:28:20.613 [2024-07-25 07:33:53.040111] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e0ef50 00:28:20.613 [2024-07-25 07:33:53.040120] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with 
name raid_bdev1, raid_bdev 0x1e0ef50 00:28:20.613 [2024-07-25 07:33:53.040190] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:20.613 pt2 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.613 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.871 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.872 "name": "raid_bdev1", 00:28:20.872 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:20.872 "strip_size_kb": 0, 00:28:20.872 "state": "online", 00:28:20.872 "raid_level": "raid1", 00:28:20.872 "superblock": true, 00:28:20.872 "num_base_bdevs": 2, 00:28:20.872 "num_base_bdevs_discovered": 1, 00:28:20.872 "num_base_bdevs_operational": 1, 00:28:20.872 "base_bdevs_list": [ 00:28:20.872 { 00:28:20.872 "name": null, 00:28:20.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.872 "is_configured": false, 00:28:20.872 "data_offset": 256, 00:28:20.872 "data_size": 7936 00:28:20.872 }, 00:28:20.872 { 00:28:20.872 "name": "pt2", 00:28:20.872 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:20.872 "is_configured": true, 00:28:20.872 "data_offset": 256, 00:28:20.872 "data_size": 7936 00:28:20.872 } 00:28:20.872 ] 00:28:20.872 }' 00:28:20.872 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.872 07:33:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:21.437 07:33:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:21.695 [2024-07-25 07:33:54.065154] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:21.695 [2024-07-25 07:33:54.065177] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:21.695 [2024-07-25 07:33:54.065225] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:21.695 [2024-07-25 07:33:54.065268] 
bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:21.695 [2024-07-25 07:33:54.065280] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e0ef50 name raid_bdev1, state offline 00:28:21.695 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.695 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:28:21.954 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:28:21.954 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:28:21.954 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:28:21.954 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:22.212 [2024-07-25 07:33:54.522328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:22.212 [2024-07-25 07:33:54.522370] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:22.212 [2024-07-25 07:33:54.522387] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0d0f0 00:28:22.212 [2024-07-25 07:33:54.522399] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:22.212 [2024-07-25 07:33:54.523926] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:22.212 [2024-07-25 07:33:54.523955] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:22.212 [2024-07-25 07:33:54.523998] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:22.212 [2024-07-25 07:33:54.524026] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:22.212 [2024-07-25 07:33:54.524111] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:22.212 [2024-07-25 07:33:54.524123] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:22.212 [2024-07-25 07:33:54.524137] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e108b0 name raid_bdev1, state configuring 00:28:22.212 [2024-07-25 07:33:54.524165] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:22.212 [2024-07-25 07:33:54.524214] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e108b0 00:28:22.212 [2024-07-25 07:33:54.524224] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:22.212 [2024-07-25 07:33:54.524278] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd5850 00:28:22.212 [2024-07-25 07:33:54.524368] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e108b0 00:28:22.212 [2024-07-25 07:33:54.524377] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e108b0 00:28:22.212 [2024-07-25 07:33:54.524441] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:22.212 pt1 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:22.212 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:22.213 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:22.213 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.213 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:22.471 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:22.471 "name": "raid_bdev1", 00:28:22.471 "uuid": "c1452626-a750-4e6d-b2c3-44a2e5eeca5c", 00:28:22.471 "strip_size_kb": 0, 00:28:22.471 "state": "online", 00:28:22.471 "raid_level": "raid1", 00:28:22.471 "superblock": true, 00:28:22.471 "num_base_bdevs": 2, 00:28:22.471 "num_base_bdevs_discovered": 1, 00:28:22.471 "num_base_bdevs_operational": 1, 00:28:22.471 "base_bdevs_list": [ 00:28:22.471 { 00:28:22.471 "name": null, 00:28:22.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:22.471 "is_configured": false, 00:28:22.471 "data_offset": 256, 00:28:22.471 "data_size": 7936 00:28:22.471 }, 00:28:22.471 { 00:28:22.471 "name": "pt2", 00:28:22.471 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:22.471 "is_configured": true, 00:28:22.471 "data_offset": 256, 00:28:22.471 "data_size": 7936 00:28:22.471 } 00:28:22.471 ] 00:28:22.471 }' 00:28:22.471 07:33:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:22.471 07:33:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:23.038 07:33:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:23.038 07:33:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:23.038 07:33:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:23.296 07:33:55 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:28:23.296 [2024-07-25 07:33:55.777853] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # '[' c1452626-a750-4e6d-b2c3-44a2e5eeca5c '!=' c1452626-a750-4e6d-b2c3-44a2e5eeca5c ']' 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@578 -- # killprocess 1765278 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1765278 ']' 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # kill -0 1765278 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # uname 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:23.296 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1765278 00:28:23.557 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:23.557 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:23.557 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1765278' 00:28:23.557 killing process with pid 1765278 00:28:23.557 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@969 -- # kill 1765278 00:28:23.557 [2024-07-25 07:33:55.857450] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:23.557 [2024-07-25 07:33:55.857503] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:23.557 [2024-07-25 07:33:55.857546] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:23.557 [2024-07-25 07:33:55.857556] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e108b0 name raid_bdev1, state offline 00:28:23.557 07:33:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@974 -- # wait 1765278 00:28:23.557 [2024-07-25 07:33:55.876806] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:23.557 07:33:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@580 -- # return 0 00:28:23.557 00:28:23.557 real 0m14.637s 00:28:23.557 user 0m26.500s 00:28:23.557 sys 0m2.757s 00:28:23.557 07:33:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:23.557 07:33:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:23.557 ************************************ 00:28:23.557 END TEST raid_superblock_test_md_separate 00:28:23.557 ************************************ 00:28:23.816 07:33:56 bdev_raid -- bdev/bdev_raid.sh@987 -- # '[' true = true ']' 00:28:23.816 07:33:56 bdev_raid -- bdev/bdev_raid.sh@988 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:28:23.816 07:33:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:28:23.816 07:33:56 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:23.816 07:33:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:23.816 ************************************ 00:28:23.816 START TEST 
raid_rebuild_test_sb_md_separate 00:28:23.816 ************************************ 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # local verify=true 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:28:23.816 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # local strip_size 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # local create_arg 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@594 -- # local data_offset 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # raid_pid=1767976 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@613 -- # waitforlisten 1767976 /var/tmp/spdk-raid.sock 00:28:23.817 07:33:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1767976 ']' 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:23.817 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:23.817 07:33:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:23.817 [2024-07-25 07:33:56.214292] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:28:23.817 [2024-07-25 07:33:56.214340] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1767976 ] 00:28:23.817 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:23.817 Zero copy mechanism will not be used. 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.0 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.1 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.2 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.3 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.4 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.5 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.6 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:01.7 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.0 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.1 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.2 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.3 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.4 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.5 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.6 cannot 
be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3d:02.7 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.0 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.1 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.2 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.3 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.4 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.5 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.6 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:01.7 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.0 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.1 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.2 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.3 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.4 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.5 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.6 cannot be used 00:28:23.817 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:23.817 EAL: Requested device 0000:3f:02.7 cannot be used 00:28:23.817 [2024-07-25 07:33:56.331828] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:24.076 [2024-07-25 07:33:56.413641] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:28:24.076 [2024-07-25 07:33:56.476496] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:24.076 [2024-07-25 07:33:56.476533] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:24.642 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:24.643 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:28:24.643 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:28:24.643 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:28:24.901 BaseBdev1_malloc 00:28:24.901 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:25.159 [2024-07-25 07:33:57.498645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:25.159 [2024-07-25 07:33:57.498690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:25.159 [2024-07-25 07:33:57.498711] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2162080 00:28:25.159 [2024-07-25 07:33:57.498722] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:25.159 [2024-07-25 07:33:57.500041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:25.159 [2024-07-25 07:33:57.500067] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:25.159 BaseBdev1 00:28:25.159 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:28:25.159 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:28:25.417 BaseBdev2_malloc 00:28:25.417 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:25.676 [2024-07-25 07:33:57.956900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:25.676 [2024-07-25 07:33:57.956939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:25.676 [2024-07-25 07:33:57.956958] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2273f50 00:28:25.676 [2024-07-25 07:33:57.956969] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:25.676 [2024-07-25 07:33:57.958088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:25.676 [2024-07-25 07:33:57.958112] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:25.676 BaseBdev2 00:28:25.676 07:33:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:28:25.676 spare_malloc 00:28:25.676 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:25.934 spare_delay 00:28:25.934 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:26.191 [2024-07-25 07:33:58.635605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:26.191 [2024-07-25 07:33:58.635640] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:26.191 [2024-07-25 07:33:58.635659] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2276990 00:28:26.191 [2024-07-25 07:33:58.635670] vbdev_passthru.c: 696:vbdev_passthru_register: 
*NOTICE*: bdev claimed 00:28:26.192 [2024-07-25 07:33:58.636808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:26.192 [2024-07-25 07:33:58.636832] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:26.192 spare 00:28:26.192 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:26.450 [2024-07-25 07:33:58.856214] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:26.450 [2024-07-25 07:33:58.857273] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:26.450 [2024-07-25 07:33:58.857431] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2279010 00:28:26.450 [2024-07-25 07:33:58.857443] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:26.450 [2024-07-25 07:33:58.857498] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e02f0 00:28:26.450 [2024-07-25 07:33:58.857597] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2279010 00:28:26.450 [2024-07-25 07:33:58.857606] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2279010 00:28:26.450 [2024-07-25 07:33:58.857665] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.450 07:33:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.708 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.708 "name": "raid_bdev1", 00:28:26.708 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:26.708 "strip_size_kb": 0, 00:28:26.708 "state": "online", 00:28:26.708 "raid_level": "raid1", 00:28:26.708 "superblock": true, 00:28:26.708 "num_base_bdevs": 2, 00:28:26.708 "num_base_bdevs_discovered": 2, 00:28:26.708 
"num_base_bdevs_operational": 2, 00:28:26.708 "base_bdevs_list": [ 00:28:26.708 { 00:28:26.708 "name": "BaseBdev1", 00:28:26.708 "uuid": "bd5653f7-86fa-5a80-ae6b-77d3892abd41", 00:28:26.708 "is_configured": true, 00:28:26.708 "data_offset": 256, 00:28:26.708 "data_size": 7936 00:28:26.708 }, 00:28:26.708 { 00:28:26.708 "name": "BaseBdev2", 00:28:26.708 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:26.708 "is_configured": true, 00:28:26.708 "data_offset": 256, 00:28:26.708 "data_size": 7936 00:28:26.708 } 00:28:26.708 ] 00:28:26.708 }' 00:28:26.708 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.708 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:27.274 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:27.274 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:28:27.533 [2024-07-25 07:33:59.875103] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:27.533 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:28:27.533 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.533 07:33:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:27.791 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:28.049 [2024-07-25 07:34:00.332114] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2279c20 
00:28:28.049 /dev/nbd0 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:28.049 1+0 records in 00:28:28.049 1+0 records out 00:28:28.049 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252606 s, 16.2 MB/s 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:28:28.049 07:34:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:28.614 7936+0 records in 00:28:28.614 7936+0 records out 00:28:28.614 32505856 bytes (33 MB, 31 MiB) copied, 0.694214 s, 46.8 MB/s 00:28:28.614 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:28.614 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:28.614 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:28.614 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:28.614 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@51 -- # local i 00:28:28.614 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:28.614 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:28.872 [2024-07-25 07:34:01.331794] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:28.872 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:29.129 [2024-07-25 07:34:01.536376] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:29.129 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:29.129 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:29.129 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:29.129 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:29.129 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:29.129 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:29.130 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:29.130 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:29.130 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:29.130 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:29.130 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.130 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.387 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:29.387 "name": "raid_bdev1", 00:28:29.387 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:29.387 "strip_size_kb": 0, 00:28:29.387 "state": "online", 
00:28:29.387 "raid_level": "raid1", 00:28:29.387 "superblock": true, 00:28:29.387 "num_base_bdevs": 2, 00:28:29.387 "num_base_bdevs_discovered": 1, 00:28:29.387 "num_base_bdevs_operational": 1, 00:28:29.387 "base_bdevs_list": [ 00:28:29.387 { 00:28:29.387 "name": null, 00:28:29.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:29.387 "is_configured": false, 00:28:29.387 "data_offset": 256, 00:28:29.387 "data_size": 7936 00:28:29.387 }, 00:28:29.387 { 00:28:29.387 "name": "BaseBdev2", 00:28:29.387 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:29.387 "is_configured": true, 00:28:29.387 "data_offset": 256, 00:28:29.387 "data_size": 7936 00:28:29.387 } 00:28:29.387 ] 00:28:29.387 }' 00:28:29.387 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:29.387 07:34:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:29.954 07:34:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:30.212 [2024-07-25 07:34:02.555059] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:30.212 [2024-07-25 07:34:02.557266] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2279a80 00:28:30.212 [2024-07-25 07:34:02.559300] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:30.212 07:34:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:31.146 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:31.146 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:31.146 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:31.146 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:31.146 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:31.146 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.146 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.403 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:31.403 "name": "raid_bdev1", 00:28:31.403 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:31.403 "strip_size_kb": 0, 00:28:31.403 "state": "online", 00:28:31.403 "raid_level": "raid1", 00:28:31.403 "superblock": true, 00:28:31.403 "num_base_bdevs": 2, 00:28:31.403 "num_base_bdevs_discovered": 2, 00:28:31.404 "num_base_bdevs_operational": 2, 00:28:31.404 "process": { 00:28:31.404 "type": "rebuild", 00:28:31.404 "target": "spare", 00:28:31.404 "progress": { 00:28:31.404 "blocks": 3072, 00:28:31.404 "percent": 38 00:28:31.404 } 00:28:31.404 }, 00:28:31.404 "base_bdevs_list": [ 00:28:31.404 { 00:28:31.404 "name": "spare", 00:28:31.404 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:31.404 "is_configured": true, 00:28:31.404 "data_offset": 256, 00:28:31.404 "data_size": 7936 00:28:31.404 }, 00:28:31.404 { 
00:28:31.404 "name": "BaseBdev2", 00:28:31.404 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:31.404 "is_configured": true, 00:28:31.404 "data_offset": 256, 00:28:31.404 "data_size": 7936 00:28:31.404 } 00:28:31.404 ] 00:28:31.404 }' 00:28:31.404 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:31.404 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:31.404 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:31.404 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:31.404 07:34:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:31.661 [2024-07-25 07:34:04.116017] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:31.662 [2024-07-25 07:34:04.171209] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:31.662 [2024-07-25 07:34:04.171249] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:31.662 [2024-07-25 07:34:04.171263] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:31.662 [2024-07-25 07:34:04.171271] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:31.662 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:31.662 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:31.662 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:31.662 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.662 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.662 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.920 "name": "raid_bdev1", 00:28:31.920 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:31.920 "strip_size_kb": 0, 00:28:31.920 "state": "online", 00:28:31.920 "raid_level": "raid1", 00:28:31.920 "superblock": true, 00:28:31.920 "num_base_bdevs": 2, 00:28:31.920 
"num_base_bdevs_discovered": 1, 00:28:31.920 "num_base_bdevs_operational": 1, 00:28:31.920 "base_bdevs_list": [ 00:28:31.920 { 00:28:31.920 "name": null, 00:28:31.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.920 "is_configured": false, 00:28:31.920 "data_offset": 256, 00:28:31.920 "data_size": 7936 00:28:31.920 }, 00:28:31.920 { 00:28:31.920 "name": "BaseBdev2", 00:28:31.920 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:31.920 "is_configured": true, 00:28:31.920 "data_offset": 256, 00:28:31.920 "data_size": 7936 00:28:31.920 } 00:28:31.920 ] 00:28:31.920 }' 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.920 07:34:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:32.487 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:32.487 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:32.487 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:32.487 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:32.487 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:32.487 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.487 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:32.745 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:32.745 "name": "raid_bdev1", 00:28:32.745 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:32.745 "strip_size_kb": 0, 00:28:32.745 "state": "online", 00:28:32.745 "raid_level": "raid1", 00:28:32.745 "superblock": true, 00:28:32.745 "num_base_bdevs": 2, 00:28:32.745 "num_base_bdevs_discovered": 1, 00:28:32.745 "num_base_bdevs_operational": 1, 00:28:32.745 "base_bdevs_list": [ 00:28:32.745 { 00:28:32.745 "name": null, 00:28:32.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:32.745 "is_configured": false, 00:28:32.745 "data_offset": 256, 00:28:32.745 "data_size": 7936 00:28:32.745 }, 00:28:32.745 { 00:28:32.745 "name": "BaseBdev2", 00:28:32.745 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:32.745 "is_configured": true, 00:28:32.745 "data_offset": 256, 00:28:32.745 "data_size": 7936 00:28:32.745 } 00:28:32.745 ] 00:28:32.745 }' 00:28:32.745 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:33.004 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:33.004 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:33.004 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:33.004 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:33.004 [2024-07-25 07:34:05.512925] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:33.004 [2024-07-25 07:34:05.515113] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x227bba0 00:28:33.004 [2024-07-25 07:34:05.516470] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:33.004 07:34:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@678 -- # sleep 1 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:34.380 "name": "raid_bdev1", 00:28:34.380 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:34.380 "strip_size_kb": 0, 00:28:34.380 "state": "online", 00:28:34.380 "raid_level": "raid1", 00:28:34.380 "superblock": true, 00:28:34.380 "num_base_bdevs": 2, 00:28:34.380 "num_base_bdevs_discovered": 2, 00:28:34.380 "num_base_bdevs_operational": 2, 00:28:34.380 "process": { 00:28:34.380 "type": "rebuild", 00:28:34.380 "target": "spare", 00:28:34.380 "progress": { 00:28:34.380 "blocks": 3072, 00:28:34.380 "percent": 38 00:28:34.380 } 00:28:34.380 }, 00:28:34.380 "base_bdevs_list": [ 00:28:34.380 { 00:28:34.380 "name": "spare", 00:28:34.380 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:34.380 "is_configured": true, 00:28:34.380 "data_offset": 256, 00:28:34.380 "data_size": 7936 00:28:34.380 }, 00:28:34.380 { 00:28:34.380 "name": "BaseBdev2", 00:28:34.380 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:34.380 "is_configured": true, 00:28:34.380 "data_offset": 256, 00:28:34.380 "data_size": 7936 00:28:34.380 } 00:28:34.380 ] 00:28:34.380 }' 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:28:34.380 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # local 
num_base_bdevs_operational=2 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # local timeout=1024 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.380 07:34:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:34.638 07:34:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:34.638 "name": "raid_bdev1", 00:28:34.638 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:34.638 "strip_size_kb": 0, 00:28:34.638 "state": "online", 00:28:34.638 "raid_level": "raid1", 00:28:34.638 "superblock": true, 00:28:34.638 "num_base_bdevs": 2, 00:28:34.638 "num_base_bdevs_discovered": 2, 00:28:34.638 "num_base_bdevs_operational": 2, 00:28:34.638 "process": { 00:28:34.638 "type": "rebuild", 00:28:34.638 "target": "spare", 00:28:34.638 "progress": { 00:28:34.638 "blocks": 3840, 00:28:34.638 "percent": 48 00:28:34.638 } 00:28:34.638 }, 00:28:34.638 "base_bdevs_list": [ 00:28:34.638 { 00:28:34.638 "name": "spare", 00:28:34.638 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:34.638 "is_configured": true, 00:28:34.638 "data_offset": 256, 00:28:34.638 "data_size": 7936 00:28:34.638 }, 00:28:34.638 { 00:28:34.638 "name": "BaseBdev2", 00:28:34.638 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:34.638 "is_configured": true, 00:28:34.638 "data_offset": 256, 00:28:34.638 "data_size": 7936 00:28:34.638 } 00:28:34.638 ] 00:28:34.638 }' 00:28:34.638 07:34:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:34.638 07:34:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:34.638 07:34:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:34.896 07:34:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:34.896 07:34:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:28:35.848 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:35.848 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:35.848 07:34:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:35.848 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:35.848 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:35.848 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:35.848 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.848 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.123 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:36.123 "name": "raid_bdev1", 00:28:36.123 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:36.123 "strip_size_kb": 0, 00:28:36.123 "state": "online", 00:28:36.123 "raid_level": "raid1", 00:28:36.123 "superblock": true, 00:28:36.123 "num_base_bdevs": 2, 00:28:36.123 "num_base_bdevs_discovered": 2, 00:28:36.123 "num_base_bdevs_operational": 2, 00:28:36.123 "process": { 00:28:36.123 "type": "rebuild", 00:28:36.123 "target": "spare", 00:28:36.123 "progress": { 00:28:36.123 "blocks": 7168, 00:28:36.123 "percent": 90 00:28:36.123 } 00:28:36.123 }, 00:28:36.123 "base_bdevs_list": [ 00:28:36.123 { 00:28:36.123 "name": "spare", 00:28:36.123 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:36.123 "is_configured": true, 00:28:36.123 "data_offset": 256, 00:28:36.123 "data_size": 7936 00:28:36.123 }, 00:28:36.123 { 00:28:36.123 "name": "BaseBdev2", 00:28:36.123 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:36.123 "is_configured": true, 00:28:36.123 "data_offset": 256, 00:28:36.123 "data_size": 7936 00:28:36.123 } 00:28:36.123 ] 00:28:36.123 }' 00:28:36.123 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:36.123 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:36.123 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:36.123 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:36.123 07:34:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:28:36.123 [2024-07-25 07:34:08.639181] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:36.123 [2024-07-25 07:34:08.639235] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:36.123 [2024-07-25 07:34:08.639311] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:37.057 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:37.057 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:37.057 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:37.057 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:37.057 07:34:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:37.057 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:37.057 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.057 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:37.315 "name": "raid_bdev1", 00:28:37.315 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:37.315 "strip_size_kb": 0, 00:28:37.315 "state": "online", 00:28:37.315 "raid_level": "raid1", 00:28:37.315 "superblock": true, 00:28:37.315 "num_base_bdevs": 2, 00:28:37.315 "num_base_bdevs_discovered": 2, 00:28:37.315 "num_base_bdevs_operational": 2, 00:28:37.315 "base_bdevs_list": [ 00:28:37.315 { 00:28:37.315 "name": "spare", 00:28:37.315 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:37.315 "is_configured": true, 00:28:37.315 "data_offset": 256, 00:28:37.315 "data_size": 7936 00:28:37.315 }, 00:28:37.315 { 00:28:37.315 "name": "BaseBdev2", 00:28:37.315 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:37.315 "is_configured": true, 00:28:37.315 "data_offset": 256, 00:28:37.315 "data_size": 7936 00:28:37.315 } 00:28:37.315 ] 00:28:37.315 }' 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@724 -- # break 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.315 07:34:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:37.573 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:37.573 "name": "raid_bdev1", 00:28:37.573 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:37.573 "strip_size_kb": 0, 00:28:37.573 "state": "online", 00:28:37.573 "raid_level": "raid1", 00:28:37.573 "superblock": true, 00:28:37.573 "num_base_bdevs": 2, 00:28:37.573 "num_base_bdevs_discovered": 2, 00:28:37.573 
"num_base_bdevs_operational": 2, 00:28:37.573 "base_bdevs_list": [ 00:28:37.573 { 00:28:37.573 "name": "spare", 00:28:37.573 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:37.573 "is_configured": true, 00:28:37.573 "data_offset": 256, 00:28:37.573 "data_size": 7936 00:28:37.573 }, 00:28:37.573 { 00:28:37.573 "name": "BaseBdev2", 00:28:37.573 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:37.574 "is_configured": true, 00:28:37.574 "data_offset": 256, 00:28:37.574 "data_size": 7936 00:28:37.574 } 00:28:37.574 ] 00:28:37.574 }' 00:28:37.574 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:37.574 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:37.574 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:37.831 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:37.831 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:37.832 "name": "raid_bdev1", 00:28:37.832 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:37.832 "strip_size_kb": 0, 00:28:37.832 "state": "online", 00:28:37.832 "raid_level": "raid1", 00:28:37.832 "superblock": true, 00:28:37.832 "num_base_bdevs": 2, 00:28:37.832 "num_base_bdevs_discovered": 2, 00:28:37.832 "num_base_bdevs_operational": 2, 00:28:37.832 "base_bdevs_list": [ 00:28:37.832 { 00:28:37.832 "name": "spare", 00:28:37.832 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:37.832 "is_configured": true, 00:28:37.832 "data_offset": 256, 00:28:37.832 "data_size": 7936 00:28:37.832 }, 00:28:37.832 { 00:28:37.832 "name": "BaseBdev2", 00:28:37.832 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:37.832 "is_configured": true, 00:28:37.832 "data_offset": 256, 00:28:37.832 "data_size": 7936 00:28:37.832 
} 00:28:37.832 ] 00:28:37.832 }' 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:37.832 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:38.398 07:34:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:38.655 [2024-07-25 07:34:11.113883] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:38.655 [2024-07-25 07:34:11.113909] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:38.655 [2024-07-25 07:34:11.113963] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:38.655 [2024-07-25 07:34:11.114014] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:38.655 [2024-07-25 07:34:11.114031] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2279010 name raid_bdev1, state offline 00:28:38.655 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.655 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # jq length 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:38.913 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:39.171 /dev/nbd0 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@869 -- # local i 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:39.171 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:39.172 1+0 records in 00:28:39.172 1+0 records out 00:28:39.172 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277389 s, 14.8 MB/s 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:39.172 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:39.431 /dev/nbd1 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:39.431 1+0 records in 00:28:39.431 1+0 records out 00:28:39.431 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00030529 s, 13.4 MB/s 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:39.431 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:39.689 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:39.689 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:39.689 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:39.689 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:39.689 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:39.689 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:39.689 07:34:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:39.947 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 
-- # basename /dev/nbd1 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:40.206 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:40.464 [2024-07-25 07:34:12.940985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:40.464 [2024-07-25 07:34:12.941029] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:40.464 [2024-07-25 07:34:12.941051] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e0dc0 00:28:40.464 [2024-07-25 07:34:12.941062] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:40.464 [2024-07-25 07:34:12.942438] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:40.464 [2024-07-25 07:34:12.942466] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:40.464 [2024-07-25 07:34:12.942522] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:40.464 [2024-07-25 07:34:12.942548] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:40.464 [2024-07-25 07:34:12.942635] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:40.464 spare 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.464 07:34:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.722 [2024-07-25 07:34:13.042939] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x227a050 00:28:40.722 [2024-07-25 07:34:13.042953] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:40.722 [2024-07-25 07:34:13.043018] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21e6aa0 00:28:40.722 [2024-07-25 07:34:13.043127] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x227a050 00:28:40.722 [2024-07-25 07:34:13.043137] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x227a050 00:28:40.722 [2024-07-25 07:34:13.043215] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:40.722 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:40.722 "name": "raid_bdev1", 00:28:40.722 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:40.722 "strip_size_kb": 0, 00:28:40.722 "state": "online", 00:28:40.722 "raid_level": "raid1", 00:28:40.722 "superblock": true, 00:28:40.722 "num_base_bdevs": 2, 00:28:40.722 "num_base_bdevs_discovered": 2, 00:28:40.722 "num_base_bdevs_operational": 2, 00:28:40.722 "base_bdevs_list": [ 00:28:40.722 { 00:28:40.722 "name": "spare", 00:28:40.722 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:40.722 "is_configured": true, 00:28:40.722 "data_offset": 256, 00:28:40.722 "data_size": 7936 00:28:40.722 }, 00:28:40.722 { 00:28:40.722 "name": "BaseBdev2", 00:28:40.722 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:40.722 "is_configured": true, 00:28:40.722 "data_offset": 256, 00:28:40.722 "data_size": 7936 00:28:40.722 } 00:28:40.722 ] 00:28:40.722 }' 00:28:40.722 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:40.722 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:41.287 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:41.287 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:41.287 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:41.287 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:41.287 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:41.287 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.287 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.545 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:28:41.545 "name": "raid_bdev1", 00:28:41.545 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:41.545 "strip_size_kb": 0, 00:28:41.545 "state": "online", 00:28:41.545 "raid_level": "raid1", 00:28:41.545 "superblock": true, 00:28:41.545 "num_base_bdevs": 2, 00:28:41.545 "num_base_bdevs_discovered": 2, 00:28:41.545 "num_base_bdevs_operational": 2, 00:28:41.545 "base_bdevs_list": [ 00:28:41.545 { 00:28:41.545 "name": "spare", 00:28:41.545 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:41.545 "is_configured": true, 00:28:41.545 "data_offset": 256, 00:28:41.545 "data_size": 7936 00:28:41.545 }, 00:28:41.545 { 00:28:41.545 "name": "BaseBdev2", 00:28:41.545 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:41.545 "is_configured": true, 00:28:41.545 "data_offset": 256, 00:28:41.545 "data_size": 7936 00:28:41.545 } 00:28:41.545 ] 00:28:41.545 }' 00:28:41.545 07:34:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:41.545 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:41.545 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:41.802 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:41.802 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.803 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:41.803 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:28:41.803 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:42.060 [2024-07-25 07:34:14.521321] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:28:42.060 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:42.317 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:42.317 "name": "raid_bdev1", 00:28:42.317 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:42.317 "strip_size_kb": 0, 00:28:42.317 "state": "online", 00:28:42.317 "raid_level": "raid1", 00:28:42.317 "superblock": true, 00:28:42.317 "num_base_bdevs": 2, 00:28:42.317 "num_base_bdevs_discovered": 1, 00:28:42.317 "num_base_bdevs_operational": 1, 00:28:42.317 "base_bdevs_list": [ 00:28:42.317 { 00:28:42.317 "name": null, 00:28:42.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:42.318 "is_configured": false, 00:28:42.318 "data_offset": 256, 00:28:42.318 "data_size": 7936 00:28:42.318 }, 00:28:42.318 { 00:28:42.318 "name": "BaseBdev2", 00:28:42.318 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:42.318 "is_configured": true, 00:28:42.318 "data_offset": 256, 00:28:42.318 "data_size": 7936 00:28:42.318 } 00:28:42.318 ] 00:28:42.318 }' 00:28:42.318 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:42.318 07:34:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:42.883 07:34:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:43.141 [2024-07-25 07:34:15.560077] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:43.141 [2024-07-25 07:34:15.560223] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:43.141 [2024-07-25 07:34:15.560239] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
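For reference, the verify_raid_bdev_process checks that follow all reduce to polling bdev_raid_get_bdevs over the raid RPC socket and reading the process fields with jq. Below is a minimal bash sketch of that polling pattern, reusing the rpc.py path, socket, bdev name and jq filters captured in this log; the 60-second timeout value and the RPC shell variable are assumptions made for the sketch, not the test's own code.

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
timeout=60   # assumed value for the sketch; the test sets its own timeout
while (( SECONDS < timeout )); do
    # Pull the raid_bdev1 entry and read the rebuild process fields, mirroring
    # the jq filters used by verify_raid_bdev_process in this log.
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    ptype=$(jq -r '.process.type // "none"' <<< "$info")
    ptarget=$(jq -r '.process.target // "none"' <<< "$info")
    [[ $ptype == rebuild && $ptarget == spare ]] && break   # rebuild onto spare is running
    sleep 1
done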
00:28:43.141 [2024-07-25 07:34:15.560265] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:43.141 [2024-07-25 07:34:15.562342] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21e6a70 00:28:43.141 [2024-07-25 07:34:15.564514] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:43.141 07:34:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # sleep 1 00:28:44.075 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:44.075 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:44.075 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:44.075 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:44.076 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:44.076 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.076 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:44.333 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:44.333 "name": "raid_bdev1", 00:28:44.333 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:44.333 "strip_size_kb": 0, 00:28:44.333 "state": "online", 00:28:44.333 "raid_level": "raid1", 00:28:44.333 "superblock": true, 00:28:44.333 "num_base_bdevs": 2, 00:28:44.333 "num_base_bdevs_discovered": 2, 00:28:44.333 "num_base_bdevs_operational": 2, 00:28:44.333 "process": { 00:28:44.333 "type": "rebuild", 00:28:44.333 "target": "spare", 00:28:44.333 "progress": { 00:28:44.333 "blocks": 3072, 00:28:44.333 "percent": 38 00:28:44.333 } 00:28:44.333 }, 00:28:44.333 "base_bdevs_list": [ 00:28:44.333 { 00:28:44.333 "name": "spare", 00:28:44.333 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:44.333 "is_configured": true, 00:28:44.333 "data_offset": 256, 00:28:44.334 "data_size": 7936 00:28:44.334 }, 00:28:44.334 { 00:28:44.334 "name": "BaseBdev2", 00:28:44.334 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:44.334 "is_configured": true, 00:28:44.334 "data_offset": 256, 00:28:44.334 "data_size": 7936 00:28:44.334 } 00:28:44.334 ] 00:28:44.334 }' 00:28:44.334 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:44.334 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:44.334 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:44.591 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:44.591 07:34:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:44.591 [2024-07-25 07:34:17.117731] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:44.849 [2024-07-25 07:34:17.176441] bdev_raid.c:2541:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:44.849 [2024-07-25 07:34:17.176483] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:44.849 [2024-07-25 07:34:17.176497] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:44.849 [2024-07-25 07:34:17.176505] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.849 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.108 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:45.108 "name": "raid_bdev1", 00:28:45.108 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:45.108 "strip_size_kb": 0, 00:28:45.108 "state": "online", 00:28:45.108 "raid_level": "raid1", 00:28:45.108 "superblock": true, 00:28:45.108 "num_base_bdevs": 2, 00:28:45.108 "num_base_bdevs_discovered": 1, 00:28:45.108 "num_base_bdevs_operational": 1, 00:28:45.108 "base_bdevs_list": [ 00:28:45.108 { 00:28:45.108 "name": null, 00:28:45.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:45.108 "is_configured": false, 00:28:45.108 "data_offset": 256, 00:28:45.108 "data_size": 7936 00:28:45.108 }, 00:28:45.108 { 00:28:45.108 "name": "BaseBdev2", 00:28:45.108 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:45.108 "is_configured": true, 00:28:45.108 "data_offset": 256, 00:28:45.108 "data_size": 7936 00:28:45.108 } 00:28:45.108 ] 00:28:45.108 }' 00:28:45.108 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:45.108 07:34:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:45.673 07:34:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:45.931 [2024-07-25 07:34:18.209809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
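The spare recycle traced just above comes down to two RPCs: remove the passthru bdev named spare, then recreate it on top of spare_delay so the raid examine path re-claims it and starts a rebuild. A short sketch under the same assumptions as the previous one (rpc.py path and socket taken from this run):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Tear down the "spare" passthru; the raid module logs the base bdev removal.
$RPC bdev_passthru_delete spare
# Recreate it on the spare_delay bdev; examine finds the raid superblock,
# re-adds spare to raid_bdev1 and a rebuild is started, as the NOTICE lines show.
$RPC bdev_passthru_create -b spare_delay -p spare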
00:28:45.931 [2024-07-25 07:34:18.209858] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:45.931 [2024-07-25 07:34:18.209882] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227a2d0 00:28:45.931 [2024-07-25 07:34:18.209894] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:45.931 [2024-07-25 07:34:18.210099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:45.931 [2024-07-25 07:34:18.210114] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:45.931 [2024-07-25 07:34:18.210178] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:45.931 [2024-07-25 07:34:18.210190] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:45.931 [2024-07-25 07:34:18.210200] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:45.931 [2024-07-25 07:34:18.210217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:45.931 [2024-07-25 07:34:18.212311] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21e7b40 00:28:45.931 [2024-07-25 07:34:18.213660] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:45.931 spare 00:28:45.931 07:34:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # sleep 1 00:28:46.866 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:46.866 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:46.866 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:46.866 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:46.866 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:46.866 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.866 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.124 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:47.124 "name": "raid_bdev1", 00:28:47.124 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:47.124 "strip_size_kb": 0, 00:28:47.124 "state": "online", 00:28:47.124 "raid_level": "raid1", 00:28:47.124 "superblock": true, 00:28:47.124 "num_base_bdevs": 2, 00:28:47.124 "num_base_bdevs_discovered": 2, 00:28:47.124 "num_base_bdevs_operational": 2, 00:28:47.124 "process": { 00:28:47.124 "type": "rebuild", 00:28:47.124 "target": "spare", 00:28:47.124 "progress": { 00:28:47.124 "blocks": 3072, 00:28:47.124 "percent": 38 00:28:47.124 } 00:28:47.124 }, 00:28:47.124 "base_bdevs_list": [ 00:28:47.124 { 00:28:47.124 "name": "spare", 00:28:47.124 "uuid": "717ac843-c0f8-5018-8c3d-930887cef6c0", 00:28:47.124 "is_configured": true, 00:28:47.125 "data_offset": 256, 00:28:47.125 "data_size": 7936 00:28:47.125 }, 00:28:47.125 { 00:28:47.125 "name": "BaseBdev2", 00:28:47.125 "uuid": 
"9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:47.125 "is_configured": true, 00:28:47.125 "data_offset": 256, 00:28:47.125 "data_size": 7936 00:28:47.125 } 00:28:47.125 ] 00:28:47.125 }' 00:28:47.125 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:47.125 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:47.125 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:47.125 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:47.125 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:47.384 [2024-07-25 07:34:19.747170] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:47.384 [2024-07-25 07:34:19.825541] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:47.384 [2024-07-25 07:34:19.825586] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:47.384 [2024-07-25 07:34:19.825600] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:47.384 [2024-07-25 07:34:19.825608] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.384 07:34:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.642 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.642 "name": "raid_bdev1", 00:28:47.642 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:47.642 "strip_size_kb": 0, 00:28:47.642 "state": "online", 00:28:47.642 "raid_level": "raid1", 00:28:47.642 "superblock": true, 00:28:47.642 "num_base_bdevs": 2, 00:28:47.642 "num_base_bdevs_discovered": 1, 00:28:47.642 
"num_base_bdevs_operational": 1, 00:28:47.642 "base_bdevs_list": [ 00:28:47.642 { 00:28:47.642 "name": null, 00:28:47.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:47.642 "is_configured": false, 00:28:47.642 "data_offset": 256, 00:28:47.642 "data_size": 7936 00:28:47.642 }, 00:28:47.642 { 00:28:47.642 "name": "BaseBdev2", 00:28:47.642 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:47.643 "is_configured": true, 00:28:47.643 "data_offset": 256, 00:28:47.643 "data_size": 7936 00:28:47.643 } 00:28:47.643 ] 00:28:47.643 }' 00:28:47.643 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:47.643 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:48.219 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:48.219 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:48.219 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:48.219 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:48.219 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:48.219 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.219 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.478 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:48.478 "name": "raid_bdev1", 00:28:48.478 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:48.478 "strip_size_kb": 0, 00:28:48.478 "state": "online", 00:28:48.478 "raid_level": "raid1", 00:28:48.478 "superblock": true, 00:28:48.478 "num_base_bdevs": 2, 00:28:48.478 "num_base_bdevs_discovered": 1, 00:28:48.478 "num_base_bdevs_operational": 1, 00:28:48.478 "base_bdevs_list": [ 00:28:48.478 { 00:28:48.478 "name": null, 00:28:48.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.478 "is_configured": false, 00:28:48.478 "data_offset": 256, 00:28:48.478 "data_size": 7936 00:28:48.478 }, 00:28:48.478 { 00:28:48.478 "name": "BaseBdev2", 00:28:48.478 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:48.478 "is_configured": true, 00:28:48.478 "data_offset": 256, 00:28:48.478 "data_size": 7936 00:28:48.478 } 00:28:48.478 ] 00:28:48.478 }' 00:28:48.478 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:48.478 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:48.478 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:48.478 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:48.478 07:34:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:48.736 07:34:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@788 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:48.995 [2024-07-25 07:34:21.416651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:48.995 [2024-07-25 07:34:21.416696] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:48.995 [2024-07-25 07:34:21.416715] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227bdb0 00:28:48.995 [2024-07-25 07:34:21.416727] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:48.995 [2024-07-25 07:34:21.416908] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:48.995 [2024-07-25 07:34:21.416922] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:48.995 [2024-07-25 07:34:21.416966] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:48.995 [2024-07-25 07:34:21.416978] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:48.995 [2024-07-25 07:34:21.416988] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:48.995 BaseBdev1 00:28:48.995 07:34:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@789 -- # sleep 1 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:49.930 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:49.931 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.931 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:50.189 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:50.189 "name": "raid_bdev1", 00:28:50.189 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:50.189 "strip_size_kb": 0, 00:28:50.189 "state": "online", 00:28:50.189 "raid_level": "raid1", 00:28:50.189 "superblock": true, 00:28:50.189 "num_base_bdevs": 2, 00:28:50.189 "num_base_bdevs_discovered": 1, 00:28:50.189 "num_base_bdevs_operational": 1, 00:28:50.189 "base_bdevs_list": [ 00:28:50.189 { 
00:28:50.189 "name": null, 00:28:50.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.189 "is_configured": false, 00:28:50.189 "data_offset": 256, 00:28:50.189 "data_size": 7936 00:28:50.189 }, 00:28:50.189 { 00:28:50.189 "name": "BaseBdev2", 00:28:50.189 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:50.189 "is_configured": true, 00:28:50.189 "data_offset": 256, 00:28:50.189 "data_size": 7936 00:28:50.189 } 00:28:50.189 ] 00:28:50.189 }' 00:28:50.189 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:50.189 07:34:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:50.756 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:50.756 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:50.756 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:50.756 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:50.756 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:50.756 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:50.756 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:51.014 "name": "raid_bdev1", 00:28:51.014 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:51.014 "strip_size_kb": 0, 00:28:51.014 "state": "online", 00:28:51.014 "raid_level": "raid1", 00:28:51.014 "superblock": true, 00:28:51.014 "num_base_bdevs": 2, 00:28:51.014 "num_base_bdevs_discovered": 1, 00:28:51.014 "num_base_bdevs_operational": 1, 00:28:51.014 "base_bdevs_list": [ 00:28:51.014 { 00:28:51.014 "name": null, 00:28:51.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:51.014 "is_configured": false, 00:28:51.014 "data_offset": 256, 00:28:51.014 "data_size": 7936 00:28:51.014 }, 00:28:51.014 { 00:28:51.014 "name": "BaseBdev2", 00:28:51.014 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:51.014 "is_configured": true, 00:28:51.014 "data_offset": 256, 00:28:51.014 "data_size": 7936 00:28:51.014 } 00:28:51.014 ] 00:28:51.014 }' 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:51.014 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:51.272 [2024-07-25 07:34:23.747070] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:51.272 [2024-07-25 07:34:23.747190] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:51.272 [2024-07-25 07:34:23.747205] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:51.272 request: 00:28:51.272 { 00:28:51.272 "base_bdev": "BaseBdev1", 00:28:51.272 "raid_bdev": "raid_bdev1", 00:28:51.272 "method": "bdev_raid_add_base_bdev", 00:28:51.272 "req_id": 1 00:28:51.272 } 00:28:51.272 Got JSON-RPC error response 00:28:51.272 response: 00:28:51.272 { 00:28:51.272 "code": -22, 00:28:51.272 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:51.272 } 00:28:51.272 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # es=1 00:28:51.272 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:28:51.272 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:28:51.272 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:28:51.272 07:34:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@793 -- # sleep 1 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:52.649 07:34:24 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.649 07:34:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:52.649 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:52.649 "name": "raid_bdev1", 00:28:52.649 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:52.649 "strip_size_kb": 0, 00:28:52.649 "state": "online", 00:28:52.649 "raid_level": "raid1", 00:28:52.649 "superblock": true, 00:28:52.649 "num_base_bdevs": 2, 00:28:52.649 "num_base_bdevs_discovered": 1, 00:28:52.649 "num_base_bdevs_operational": 1, 00:28:52.649 "base_bdevs_list": [ 00:28:52.649 { 00:28:52.649 "name": null, 00:28:52.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.649 "is_configured": false, 00:28:52.649 "data_offset": 256, 00:28:52.649 "data_size": 7936 00:28:52.649 }, 00:28:52.649 { 00:28:52.649 "name": "BaseBdev2", 00:28:52.649 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:52.649 "is_configured": true, 00:28:52.649 "data_offset": 256, 00:28:52.649 "data_size": 7936 00:28:52.649 } 00:28:52.649 ] 00:28:52.649 }' 00:28:52.649 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:52.649 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:53.212 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:53.212 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:53.212 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:53.212 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:53.212 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:53.212 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.212 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:53.470 "name": "raid_bdev1", 00:28:53.470 "uuid": "d6071a54-7de4-457d-b5c3-dd67b8e9e076", 00:28:53.470 "strip_size_kb": 0, 
00:28:53.470 "state": "online", 00:28:53.470 "raid_level": "raid1", 00:28:53.470 "superblock": true, 00:28:53.470 "num_base_bdevs": 2, 00:28:53.470 "num_base_bdevs_discovered": 1, 00:28:53.470 "num_base_bdevs_operational": 1, 00:28:53.470 "base_bdevs_list": [ 00:28:53.470 { 00:28:53.470 "name": null, 00:28:53.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:53.470 "is_configured": false, 00:28:53.470 "data_offset": 256, 00:28:53.470 "data_size": 7936 00:28:53.470 }, 00:28:53.470 { 00:28:53.470 "name": "BaseBdev2", 00:28:53.470 "uuid": "9efac2d4-30ac-5c06-b055-b2a19019fe5e", 00:28:53.470 "is_configured": true, 00:28:53.470 "data_offset": 256, 00:28:53.470 "data_size": 7936 00:28:53.470 } 00:28:53.470 ] 00:28:53.470 }' 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@798 -- # killprocess 1767976 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1767976 ']' 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1767976 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1767976 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1767976' 00:28:53.470 killing process with pid 1767976 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1767976 00:28:53.470 Received shutdown signal, test time was about 60.000000 seconds 00:28:53.470 00:28:53.470 Latency(us) 00:28:53.470 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:53.470 =================================================================================================================== 00:28:53.470 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:53.470 [2024-07-25 07:34:25.922536] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:53.470 [2024-07-25 07:34:25.922622] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:53.470 [2024-07-25 07:34:25.922661] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:53.470 [2024-07-25 07:34:25.922673] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x227a050 name raid_bdev1, state offline 00:28:53.470 07:34:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 
1767976 00:28:53.470 [2024-07-25 07:34:25.950221] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:53.728 07:34:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@800 -- # return 0 00:28:53.728 00:28:53.728 real 0m29.987s 00:28:53.728 user 0m46.411s 00:28:53.728 sys 0m4.805s 00:28:53.728 07:34:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:53.728 07:34:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:53.728 ************************************ 00:28:53.728 END TEST raid_rebuild_test_sb_md_separate 00:28:53.728 ************************************ 00:28:53.728 07:34:26 bdev_raid -- bdev/bdev_raid.sh@991 -- # base_malloc_params='-m 32 -i' 00:28:53.728 07:34:26 bdev_raid -- bdev/bdev_raid.sh@992 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:28:53.728 07:34:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:28:53.728 07:34:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:53.728 07:34:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:53.728 ************************************ 00:28:53.728 START TEST raid_state_function_test_sb_md_interleaved 00:28:53.728 ************************************ 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:53.728 07:34:26 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1773473 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1773473' 00:28:53.728 Process raid pid: 1773473 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1773473 /var/tmp/spdk-raid.sock 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1773473 ']' 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:53.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:53.728 07:34:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:53.987 [2024-07-25 07:34:26.295424] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
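The harness here starts a fresh bdev_svc instance on its own RPC socket and blocks until it answers before issuing any raid RPCs. A rough equivalent of that startup handshake, using the binary path, socket, and flags shown in the trace; the polling loop and the use of rpc_get_methods as a liveness probe are illustrative rather than a copy of the waitforlisten helper:

  app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  "$app" -r "$sock" -i 0 -L bdev_raid &   # same flags as the trace: RPC socket, shm id 0, bdev_raid debug log
  raid_pid=$!
  for _ in $(seq 1 100); do               # poll until the RPC server answers (bound is illustrative)
      "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1 && break
      sleep 0.1
  done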
00:28:53.987 [2024-07-25 07:34:26.295481] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.0 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.1 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.2 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.3 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.4 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.5 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.6 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:01.7 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.0 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.1 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.2 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.3 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.4 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.5 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.6 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3d:02.7 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3f:01.0 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3f:01.1 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3f:01.2 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3f:01.3 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3f:01.4 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3f:01.5 cannot be used 00:28:53.987 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.987 EAL: Requested device 0000:3f:01.6 cannot be used 00:28:53.988 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:01.7 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.0 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.1 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.2 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.3 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.4 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.5 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.6 cannot be used 00:28:53.988 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:53.988 EAL: Requested device 0000:3f:02.7 cannot be used 00:28:53.988 [2024-07-25 07:34:26.426820] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:53.988 [2024-07-25 07:34:26.512548] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:28:54.246 [2024-07-25 07:34:26.571692] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:54.246 [2024-07-25 07:34:26.571727] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:54.813 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:54.813 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:28:54.813 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:55.072 [2024-07-25 07:34:27.407361] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:55.072 [2024-07-25 07:34:27.407397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:55.072 [2024-07-25 07:34:27.407407] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:55.072 [2024-07-25 07:34:27.407418] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.072 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:55.331 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:55.331 "name": "Existed_Raid", 00:28:55.331 "uuid": "7f6e017e-9f92-4cf6-b1f6-0e34da365157", 00:28:55.331 "strip_size_kb": 0, 00:28:55.331 "state": "configuring", 00:28:55.331 "raid_level": "raid1", 00:28:55.331 "superblock": true, 00:28:55.331 "num_base_bdevs": 2, 00:28:55.331 "num_base_bdevs_discovered": 0, 00:28:55.331 "num_base_bdevs_operational": 2, 00:28:55.331 "base_bdevs_list": [ 00:28:55.331 { 00:28:55.331 "name": "BaseBdev1", 00:28:55.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:55.331 "is_configured": false, 00:28:55.331 "data_offset": 0, 00:28:55.331 "data_size": 0 00:28:55.331 }, 00:28:55.331 { 00:28:55.331 "name": "BaseBdev2", 00:28:55.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:55.331 "is_configured": false, 00:28:55.331 "data_offset": 0, 00:28:55.331 "data_size": 0 00:28:55.331 } 00:28:55.331 ] 00:28:55.331 }' 00:28:55.331 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:55.331 07:34:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:55.897 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:55.897 [2024-07-25 07:34:28.422041] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:55.897 [2024-07-25 07:34:28.422066] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11d6ea0 name Existed_Raid, state configuring 00:28:56.156 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:56.156 [2024-07-25 07:34:28.650648] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:56.156 [2024-07-25 07:34:28.650675] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:56.156 [2024-07-25 07:34:28.650684] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:56.156 [2024-07-25 07:34:28.650694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:56.156 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:28:56.414 [2024-07-25 07:34:28.884846] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:56.414 BaseBdev1 00:28:56.414 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:56.414 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:28:56.414 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:56.414 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:28:56.414 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:56.414 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:56.414 07:34:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:56.671 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:56.930 [ 00:28:56.930 { 00:28:56.930 "name": "BaseBdev1", 00:28:56.930 "aliases": [ 00:28:56.930 "eee3bdc9-2eee-4d46-9c93-16fc594e9641" 00:28:56.930 ], 00:28:56.930 "product_name": "Malloc disk", 00:28:56.930 "block_size": 4128, 00:28:56.930 "num_blocks": 8192, 00:28:56.930 "uuid": "eee3bdc9-2eee-4d46-9c93-16fc594e9641", 00:28:56.930 "md_size": 32, 00:28:56.930 "md_interleave": true, 00:28:56.930 "dif_type": 0, 00:28:56.930 "assigned_rate_limits": { 00:28:56.930 "rw_ios_per_sec": 0, 00:28:56.930 "rw_mbytes_per_sec": 0, 00:28:56.930 "r_mbytes_per_sec": 0, 00:28:56.930 "w_mbytes_per_sec": 0 00:28:56.930 }, 00:28:56.930 "claimed": true, 00:28:56.930 "claim_type": "exclusive_write", 00:28:56.930 "zoned": false, 00:28:56.930 "supported_io_types": { 00:28:56.930 "read": true, 00:28:56.930 "write": true, 00:28:56.930 "unmap": true, 00:28:56.930 "flush": true, 00:28:56.930 "reset": true, 00:28:56.930 "nvme_admin": false, 00:28:56.930 "nvme_io": false, 00:28:56.930 "nvme_io_md": false, 00:28:56.930 "write_zeroes": true, 00:28:56.930 "zcopy": true, 00:28:56.930 "get_zone_info": false, 00:28:56.930 "zone_management": false, 00:28:56.930 "zone_append": false, 00:28:56.930 "compare": false, 00:28:56.930 "compare_and_write": false, 00:28:56.930 "abort": true, 00:28:56.930 "seek_hole": false, 00:28:56.930 "seek_data": false, 00:28:56.930 "copy": true, 00:28:56.930 "nvme_iov_md": false 00:28:56.930 }, 00:28:56.930 "memory_domains": [ 00:28:56.930 { 00:28:56.930 "dma_device_id": "system", 00:28:56.930 "dma_device_type": 1 00:28:56.930 }, 00:28:56.930 { 00:28:56.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:56.930 "dma_device_type": 2 00:28:56.930 } 00:28:56.930 ], 00:28:56.930 "driver_specific": {} 00:28:56.930 } 00:28:56.930 ] 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring 
raid1 0 2 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.930 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:57.189 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:57.189 "name": "Existed_Raid", 00:28:57.189 "uuid": "ccb39fde-b191-4b92-a241-c3548a5c5768", 00:28:57.189 "strip_size_kb": 0, 00:28:57.189 "state": "configuring", 00:28:57.189 "raid_level": "raid1", 00:28:57.189 "superblock": true, 00:28:57.189 "num_base_bdevs": 2, 00:28:57.189 "num_base_bdevs_discovered": 1, 00:28:57.189 "num_base_bdevs_operational": 2, 00:28:57.189 "base_bdevs_list": [ 00:28:57.189 { 00:28:57.189 "name": "BaseBdev1", 00:28:57.189 "uuid": "eee3bdc9-2eee-4d46-9c93-16fc594e9641", 00:28:57.189 "is_configured": true, 00:28:57.189 "data_offset": 256, 00:28:57.189 "data_size": 7936 00:28:57.189 }, 00:28:57.189 { 00:28:57.189 "name": "BaseBdev2", 00:28:57.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.189 "is_configured": false, 00:28:57.189 "data_offset": 0, 00:28:57.189 "data_size": 0 00:28:57.189 } 00:28:57.189 ] 00:28:57.189 }' 00:28:57.189 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:57.189 07:34:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:57.756 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:58.014 [2024-07-25 07:34:30.348728] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:58.014 [2024-07-25 07:34:30.348764] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11d6790 name Existed_Raid, state configuring 00:28:58.014 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 
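The remainder of this test drives Existed_Raid from "configuring" to "online": the create call below claims the already-present BaseBdev1, and the raid only assembles once a second malloc bdev with matching interleaved metadata (-m 32 -i) is added. Condensed into the bare RPC sequence, every command is taken from the trace; only the final jq state query is an illustrative addition:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  "$rpc" -s "$sock" bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1      # interleaved-md base bdev (already present at this point in the trace)
  "$rpc" -s "$sock" bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid   # stays "configuring": BaseBdev2 missing
  "$rpc" -s "$sock" bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2      # second base bdev arrives, Existed_Raid goes "online"
  "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # expect "online"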
00:28:58.272 [2024-07-25 07:34:30.577359] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:58.273 [2024-07-25 07:34:30.578743] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:58.273 [2024-07-25 07:34:30.578774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.273 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:58.531 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:58.531 "name": "Existed_Raid", 00:28:58.531 "uuid": "3d8c965e-8c3a-41b5-b65b-2659454e52b0", 00:28:58.531 "strip_size_kb": 0, 00:28:58.531 "state": "configuring", 00:28:58.531 "raid_level": "raid1", 00:28:58.531 "superblock": true, 00:28:58.531 "num_base_bdevs": 2, 00:28:58.531 "num_base_bdevs_discovered": 1, 00:28:58.531 "num_base_bdevs_operational": 2, 00:28:58.531 "base_bdevs_list": [ 00:28:58.531 { 00:28:58.531 "name": "BaseBdev1", 00:28:58.531 "uuid": "eee3bdc9-2eee-4d46-9c93-16fc594e9641", 00:28:58.531 "is_configured": true, 00:28:58.531 "data_offset": 256, 00:28:58.531 "data_size": 7936 00:28:58.531 }, 00:28:58.531 { 00:28:58.531 "name": "BaseBdev2", 00:28:58.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.531 "is_configured": false, 00:28:58.531 "data_offset": 0, 00:28:58.531 "data_size": 0 00:28:58.531 } 00:28:58.531 ] 00:28:58.531 }' 00:28:58.531 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:58.531 07:34:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@10 -- # set +x 00:28:59.097 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:28:59.098 [2024-07-25 07:34:31.615340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:59.098 [2024-07-25 07:34:31.615458] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1258690 00:28:59.098 [2024-07-25 07:34:31.615470] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:59.098 [2024-07-25 07:34:31.615524] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1369ee0 00:28:59.098 [2024-07-25 07:34:31.615601] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1258690 00:28:59.098 [2024-07-25 07:34:31.615610] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1258690 00:28:59.098 [2024-07-25 07:34:31.615660] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:59.098 BaseBdev2 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:59.356 07:34:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:59.614 [ 00:28:59.614 { 00:28:59.614 "name": "BaseBdev2", 00:28:59.615 "aliases": [ 00:28:59.615 "84d3c192-5533-478d-8992-c94a1360dfd9" 00:28:59.615 ], 00:28:59.615 "product_name": "Malloc disk", 00:28:59.615 "block_size": 4128, 00:28:59.615 "num_blocks": 8192, 00:28:59.615 "uuid": "84d3c192-5533-478d-8992-c94a1360dfd9", 00:28:59.615 "md_size": 32, 00:28:59.615 "md_interleave": true, 00:28:59.615 "dif_type": 0, 00:28:59.615 "assigned_rate_limits": { 00:28:59.615 "rw_ios_per_sec": 0, 00:28:59.615 "rw_mbytes_per_sec": 0, 00:28:59.615 "r_mbytes_per_sec": 0, 00:28:59.615 "w_mbytes_per_sec": 0 00:28:59.615 }, 00:28:59.615 "claimed": true, 00:28:59.615 "claim_type": "exclusive_write", 00:28:59.615 "zoned": false, 00:28:59.615 "supported_io_types": { 00:28:59.615 "read": true, 00:28:59.615 "write": true, 00:28:59.615 "unmap": true, 00:28:59.615 "flush": true, 00:28:59.615 "reset": true, 00:28:59.615 "nvme_admin": false, 00:28:59.615 "nvme_io": false, 00:28:59.615 "nvme_io_md": false, 00:28:59.615 "write_zeroes": true, 00:28:59.615 "zcopy": true, 00:28:59.615 "get_zone_info": false, 00:28:59.615 
"zone_management": false, 00:28:59.615 "zone_append": false, 00:28:59.615 "compare": false, 00:28:59.615 "compare_and_write": false, 00:28:59.615 "abort": true, 00:28:59.615 "seek_hole": false, 00:28:59.615 "seek_data": false, 00:28:59.615 "copy": true, 00:28:59.615 "nvme_iov_md": false 00:28:59.615 }, 00:28:59.615 "memory_domains": [ 00:28:59.615 { 00:28:59.615 "dma_device_id": "system", 00:28:59.615 "dma_device_type": 1 00:28:59.615 }, 00:28:59.615 { 00:28:59.615 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:59.615 "dma_device_type": 2 00:28:59.615 } 00:28:59.615 ], 00:28:59.615 "driver_specific": {} 00:28:59.615 } 00:28:59.615 ] 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.615 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:59.873 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:59.873 "name": "Existed_Raid", 00:28:59.873 "uuid": "3d8c965e-8c3a-41b5-b65b-2659454e52b0", 00:28:59.873 "strip_size_kb": 0, 00:28:59.873 "state": "online", 00:28:59.873 "raid_level": "raid1", 00:28:59.873 "superblock": true, 00:28:59.873 "num_base_bdevs": 2, 00:28:59.873 "num_base_bdevs_discovered": 2, 00:28:59.873 "num_base_bdevs_operational": 2, 00:28:59.873 "base_bdevs_list": [ 00:28:59.873 { 00:28:59.873 "name": "BaseBdev1", 00:28:59.873 "uuid": "eee3bdc9-2eee-4d46-9c93-16fc594e9641", 00:28:59.873 "is_configured": true, 00:28:59.873 "data_offset": 256, 00:28:59.873 "data_size": 7936 00:28:59.873 }, 00:28:59.873 { 00:28:59.873 "name": "BaseBdev2", 00:28:59.873 "uuid": "84d3c192-5533-478d-8992-c94a1360dfd9", 
00:28:59.873 "is_configured": true, 00:28:59.873 "data_offset": 256, 00:28:59.873 "data_size": 7936 00:28:59.873 } 00:28:59.873 ] 00:28:59.874 }' 00:28:59.874 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:59.874 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:00.443 07:34:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:00.701 [2024-07-25 07:34:33.035581] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:00.701 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:00.701 "name": "Existed_Raid", 00:29:00.701 "aliases": [ 00:29:00.701 "3d8c965e-8c3a-41b5-b65b-2659454e52b0" 00:29:00.701 ], 00:29:00.701 "product_name": "Raid Volume", 00:29:00.701 "block_size": 4128, 00:29:00.701 "num_blocks": 7936, 00:29:00.701 "uuid": "3d8c965e-8c3a-41b5-b65b-2659454e52b0", 00:29:00.701 "md_size": 32, 00:29:00.701 "md_interleave": true, 00:29:00.701 "dif_type": 0, 00:29:00.701 "assigned_rate_limits": { 00:29:00.701 "rw_ios_per_sec": 0, 00:29:00.701 "rw_mbytes_per_sec": 0, 00:29:00.701 "r_mbytes_per_sec": 0, 00:29:00.701 "w_mbytes_per_sec": 0 00:29:00.701 }, 00:29:00.701 "claimed": false, 00:29:00.701 "zoned": false, 00:29:00.701 "supported_io_types": { 00:29:00.701 "read": true, 00:29:00.701 "write": true, 00:29:00.701 "unmap": false, 00:29:00.701 "flush": false, 00:29:00.701 "reset": true, 00:29:00.701 "nvme_admin": false, 00:29:00.701 "nvme_io": false, 00:29:00.701 "nvme_io_md": false, 00:29:00.701 "write_zeroes": true, 00:29:00.701 "zcopy": false, 00:29:00.701 "get_zone_info": false, 00:29:00.701 "zone_management": false, 00:29:00.701 "zone_append": false, 00:29:00.701 "compare": false, 00:29:00.701 "compare_and_write": false, 00:29:00.701 "abort": false, 00:29:00.701 "seek_hole": false, 00:29:00.701 "seek_data": false, 00:29:00.701 "copy": false, 00:29:00.701 "nvme_iov_md": false 00:29:00.701 }, 00:29:00.701 "memory_domains": [ 00:29:00.701 { 00:29:00.701 "dma_device_id": "system", 00:29:00.701 "dma_device_type": 1 00:29:00.701 }, 00:29:00.701 { 00:29:00.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:00.701 "dma_device_type": 2 00:29:00.701 }, 00:29:00.701 { 00:29:00.701 "dma_device_id": "system", 00:29:00.701 "dma_device_type": 1 00:29:00.701 }, 00:29:00.701 { 00:29:00.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:00.701 
"dma_device_type": 2 00:29:00.701 } 00:29:00.701 ], 00:29:00.701 "driver_specific": { 00:29:00.701 "raid": { 00:29:00.702 "uuid": "3d8c965e-8c3a-41b5-b65b-2659454e52b0", 00:29:00.702 "strip_size_kb": 0, 00:29:00.702 "state": "online", 00:29:00.702 "raid_level": "raid1", 00:29:00.702 "superblock": true, 00:29:00.702 "num_base_bdevs": 2, 00:29:00.702 "num_base_bdevs_discovered": 2, 00:29:00.702 "num_base_bdevs_operational": 2, 00:29:00.702 "base_bdevs_list": [ 00:29:00.702 { 00:29:00.702 "name": "BaseBdev1", 00:29:00.702 "uuid": "eee3bdc9-2eee-4d46-9c93-16fc594e9641", 00:29:00.702 "is_configured": true, 00:29:00.702 "data_offset": 256, 00:29:00.702 "data_size": 7936 00:29:00.702 }, 00:29:00.702 { 00:29:00.702 "name": "BaseBdev2", 00:29:00.702 "uuid": "84d3c192-5533-478d-8992-c94a1360dfd9", 00:29:00.702 "is_configured": true, 00:29:00.702 "data_offset": 256, 00:29:00.702 "data_size": 7936 00:29:00.702 } 00:29:00.702 ] 00:29:00.702 } 00:29:00.702 } 00:29:00.702 }' 00:29:00.702 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:00.702 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:00.702 BaseBdev2' 00:29:00.702 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:00.702 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:00.702 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:00.960 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:00.960 "name": "BaseBdev1", 00:29:00.960 "aliases": [ 00:29:00.960 "eee3bdc9-2eee-4d46-9c93-16fc594e9641" 00:29:00.960 ], 00:29:00.960 "product_name": "Malloc disk", 00:29:00.960 "block_size": 4128, 00:29:00.960 "num_blocks": 8192, 00:29:00.960 "uuid": "eee3bdc9-2eee-4d46-9c93-16fc594e9641", 00:29:00.960 "md_size": 32, 00:29:00.960 "md_interleave": true, 00:29:00.960 "dif_type": 0, 00:29:00.960 "assigned_rate_limits": { 00:29:00.960 "rw_ios_per_sec": 0, 00:29:00.960 "rw_mbytes_per_sec": 0, 00:29:00.960 "r_mbytes_per_sec": 0, 00:29:00.960 "w_mbytes_per_sec": 0 00:29:00.960 }, 00:29:00.960 "claimed": true, 00:29:00.960 "claim_type": "exclusive_write", 00:29:00.960 "zoned": false, 00:29:00.960 "supported_io_types": { 00:29:00.960 "read": true, 00:29:00.960 "write": true, 00:29:00.960 "unmap": true, 00:29:00.960 "flush": true, 00:29:00.960 "reset": true, 00:29:00.960 "nvme_admin": false, 00:29:00.960 "nvme_io": false, 00:29:00.960 "nvme_io_md": false, 00:29:00.960 "write_zeroes": true, 00:29:00.960 "zcopy": true, 00:29:00.960 "get_zone_info": false, 00:29:00.960 "zone_management": false, 00:29:00.960 "zone_append": false, 00:29:00.960 "compare": false, 00:29:00.960 "compare_and_write": false, 00:29:00.960 "abort": true, 00:29:00.960 "seek_hole": false, 00:29:00.960 "seek_data": false, 00:29:00.960 "copy": true, 00:29:00.960 "nvme_iov_md": false 00:29:00.960 }, 00:29:00.960 "memory_domains": [ 00:29:00.960 { 00:29:00.960 "dma_device_id": "system", 00:29:00.960 "dma_device_type": 1 00:29:00.960 }, 00:29:00.960 { 00:29:00.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:00.960 "dma_device_type": 2 
00:29:00.960 } 00:29:00.960 ], 00:29:00.960 "driver_specific": {} 00:29:00.960 }' 00:29:00.960 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:00.960 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:00.960 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:00.960 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:00.960 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:01.218 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:01.476 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:01.476 "name": "BaseBdev2", 00:29:01.476 "aliases": [ 00:29:01.476 "84d3c192-5533-478d-8992-c94a1360dfd9" 00:29:01.476 ], 00:29:01.476 "product_name": "Malloc disk", 00:29:01.476 "block_size": 4128, 00:29:01.476 "num_blocks": 8192, 00:29:01.476 "uuid": "84d3c192-5533-478d-8992-c94a1360dfd9", 00:29:01.476 "md_size": 32, 00:29:01.476 "md_interleave": true, 00:29:01.476 "dif_type": 0, 00:29:01.476 "assigned_rate_limits": { 00:29:01.476 "rw_ios_per_sec": 0, 00:29:01.476 "rw_mbytes_per_sec": 0, 00:29:01.476 "r_mbytes_per_sec": 0, 00:29:01.476 "w_mbytes_per_sec": 0 00:29:01.476 }, 00:29:01.476 "claimed": true, 00:29:01.476 "claim_type": "exclusive_write", 00:29:01.476 "zoned": false, 00:29:01.476 "supported_io_types": { 00:29:01.476 "read": true, 00:29:01.476 "write": true, 00:29:01.476 "unmap": true, 00:29:01.476 "flush": true, 00:29:01.476 "reset": true, 00:29:01.476 "nvme_admin": false, 00:29:01.476 "nvme_io": false, 00:29:01.476 "nvme_io_md": false, 00:29:01.476 "write_zeroes": true, 00:29:01.476 "zcopy": true, 00:29:01.476 "get_zone_info": false, 00:29:01.476 "zone_management": false, 00:29:01.476 "zone_append": false, 00:29:01.476 "compare": false, 00:29:01.476 "compare_and_write": false, 00:29:01.476 "abort": true, 00:29:01.476 "seek_hole": false, 00:29:01.476 "seek_data": false, 00:29:01.476 "copy": true, 00:29:01.476 "nvme_iov_md": false 00:29:01.476 }, 00:29:01.476 "memory_domains": [ 00:29:01.476 { 
00:29:01.476 "dma_device_id": "system", 00:29:01.476 "dma_device_type": 1 00:29:01.476 }, 00:29:01.476 { 00:29:01.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:01.476 "dma_device_type": 2 00:29:01.476 } 00:29:01.476 ], 00:29:01.476 "driver_specific": {} 00:29:01.476 }' 00:29:01.476 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:01.476 07:34:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:01.476 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:01.476 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:01.733 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:01.733 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:01.733 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.733 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.733 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:01.733 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.734 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.734 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:01.734 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:01.992 [2024-07-25 07:34:34.447147] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.992 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:02.250 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:02.250 "name": "Existed_Raid", 00:29:02.250 "uuid": "3d8c965e-8c3a-41b5-b65b-2659454e52b0", 00:29:02.250 "strip_size_kb": 0, 00:29:02.250 "state": "online", 00:29:02.250 "raid_level": "raid1", 00:29:02.250 "superblock": true, 00:29:02.250 "num_base_bdevs": 2, 00:29:02.250 "num_base_bdevs_discovered": 1, 00:29:02.250 "num_base_bdevs_operational": 1, 00:29:02.250 "base_bdevs_list": [ 00:29:02.250 { 00:29:02.250 "name": null, 00:29:02.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:02.250 "is_configured": false, 00:29:02.250 "data_offset": 256, 00:29:02.250 "data_size": 7936 00:29:02.250 }, 00:29:02.250 { 00:29:02.250 "name": "BaseBdev2", 00:29:02.250 "uuid": "84d3c192-5533-478d-8992-c94a1360dfd9", 00:29:02.250 "is_configured": true, 00:29:02.251 "data_offset": 256, 00:29:02.251 "data_size": 7936 00:29:02.251 } 00:29:02.251 ] 00:29:02.251 }' 00:29:02.251 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:02.251 07:34:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:02.817 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:02.817 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:02.817 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:02.817 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.076 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:03.076 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:03.076 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:03.335 [2024-07-25 07:34:35.715527] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:03.335 [2024-07-25 07:34:35.715604] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:03.335 [2024-07-25 07:34:35.726171] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:03.335 [2024-07-25 07:34:35.726202] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:03.335 [2024-07-25 07:34:35.726212] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1258690 name Existed_Raid, state offline 00:29:03.335 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:03.335 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:03.335 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.335 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1773473 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1773473 ']' 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1773473 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:03.593 07:34:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1773473 00:29:03.594 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:03.594 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:03.594 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1773473' 00:29:03.594 killing process with pid 1773473 00:29:03.594 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1773473 00:29:03.594 [2024-07-25 07:34:36.033086] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:03.594 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1773473 00:29:03.594 [2024-07-25 07:34:36.033922] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:03.853 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:29:03.853 00:29:03.853 real 0m9.995s 00:29:03.853 user 0m17.652s 00:29:03.853 sys 0m1.944s 00:29:03.853 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:03.853 07:34:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:03.853 ************************************ 00:29:03.853 END TEST raid_state_function_test_sb_md_interleaved 00:29:03.853 ************************************ 
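The state checks traced above are driven entirely by the SPDK JSON-RPC client and the jq filters that appear verbatim in the log. As a rough guide to reproducing that query pattern by hand, the sketch below assumes the same RPC socket (/var/tmp/spdk-raid.sock) and workspace path as this run, and it condenses the separate jq .block_size / .md_size / .md_interleave / .dif_type probes into a single filter; it is illustrative only, not part of the captured output.

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Record inspected by verify_raid_bdev_state (state, raid_level, num_base_bdevs_*, base_bdevs_list)
    $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
    # Per-bdev properties checked by verify_raid_bdev_properties (block_size 4128, md_size 32, interleaved md, dif_type 0)
    $rpc_py bdev_get_bdevs -b BaseBdev1 | jq '.[] | {block_size, md_size, md_interleave, dif_type}'
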
00:29:03.853 07:34:36 bdev_raid -- bdev/bdev_raid.sh@993 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:29:03.853 07:34:36 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:29:03.853 07:34:36 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:03.853 07:34:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:03.853 ************************************ 00:29:03.853 START TEST raid_superblock_test_md_interleaved 00:29:03.853 ************************************ 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@414 -- # local strip_size 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@427 -- # raid_pid=1775282 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@428 -- # waitforlisten 1775282 /var/tmp/spdk-raid.sock 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1775282 ']' 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:29:03.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:03.853 07:34:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:03.853 [2024-07-25 07:34:36.371830] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:03.853 [2024-07-25 07:34:36.371887] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1775282 ] 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 
0000:3f:01.3 cannot be used 00:29:04.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.112 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:04.113 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:04.113 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:04.113 [2024-07-25 07:34:36.504855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:04.113 [2024-07-25 07:34:36.591057] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:04.371 [2024-07-25 07:34:36.650628] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:04.371 [2024-07-25 07:34:36.650660] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:04.938 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:05.197 malloc1 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:05.197 [2024-07-25 07:34:37.709175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:05.197 [2024-07-25 07:34:37.709218] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:05.197 [2024-07-25 07:34:37.709237] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0c260 00:29:05.197 [2024-07-25 07:34:37.709249] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:05.197 [2024-07-25 07:34:37.710601] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:05.197 [2024-07-25 07:34:37.710627] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:05.197 pt1 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:05.197 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:05.455 malloc2 00:29:05.455 07:34:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:05.714 [2024-07-25 07:34:38.166979] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:05.714 [2024-07-25 07:34:38.167020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:05.714 [2024-07-25 07:34:38.167038] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dfe0b0 00:29:05.714 [2024-07-25 07:34:38.167049] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:05.714 [2024-07-25 07:34:38.168349] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:05.714 [2024-07-25 07:34:38.168384] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:05.714 pt2 00:29:05.714 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 
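Between the two loop iterations above, the superblock test builds each base bdev the same way: a 32 MiB malloc bdev with 4096-byte data blocks and 32 bytes of interleaved metadata (which is why the dumps later report a 4128-byte block_size and 8192 blocks), wrapped in a passthru bdev with a fixed UUID, after which the pair is assembled into a RAID1 volume carrying an on-disk superblock. A condensed sketch of those RPC calls, assuming the same socket and workspace paths as this run and shown only for orientation:

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Interleaved-metadata base bdevs: 32 MiB, 4096-byte blocks, 32-byte md (-m 32 -i)
    $rpc_py bdev_malloc_create 32 4096 -m 32 -i -b malloc1
    $rpc_py bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    $rpc_py bdev_malloc_create 32 4096 -m 32 -i -b malloc2
    $rpc_py bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    # RAID1 over the two passthru bdevs; -s writes the superblock (raid_bdev1 in the trace that follows)
    $rpc_py bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
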
00:29:05.714 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:05.714 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:29:05.973 [2024-07-25 07:34:38.383568] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:05.973 [2024-07-25 07:34:38.384855] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:05.973 [2024-07-25 07:34:38.384994] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1df1f90 00:29:05.973 [2024-07-25 07:34:38.385006] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:05.973 [2024-07-25 07:34:38.385072] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c6ed40 00:29:05.973 [2024-07-25 07:34:38.385160] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1df1f90 00:29:05.973 [2024-07-25 07:34:38.385170] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1df1f90 00:29:05.973 [2024-07-25 07:34:38.385223] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.973 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.231 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.231 "name": "raid_bdev1", 00:29:06.231 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:06.231 "strip_size_kb": 0, 00:29:06.231 "state": "online", 00:29:06.231 "raid_level": "raid1", 00:29:06.231 "superblock": true, 00:29:06.231 "num_base_bdevs": 2, 00:29:06.231 "num_base_bdevs_discovered": 2, 00:29:06.231 "num_base_bdevs_operational": 2, 00:29:06.231 "base_bdevs_list": [ 00:29:06.231 { 00:29:06.231 "name": "pt1", 00:29:06.231 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:29:06.231 "is_configured": true, 00:29:06.231 "data_offset": 256, 00:29:06.231 "data_size": 7936 00:29:06.231 }, 00:29:06.231 { 00:29:06.231 "name": "pt2", 00:29:06.231 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:06.231 "is_configured": true, 00:29:06.231 "data_offset": 256, 00:29:06.231 "data_size": 7936 00:29:06.231 } 00:29:06.231 ] 00:29:06.231 }' 00:29:06.231 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.231 07:34:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:06.797 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:07.055 [2024-07-25 07:34:39.418508] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:07.055 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:07.055 "name": "raid_bdev1", 00:29:07.055 "aliases": [ 00:29:07.055 "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733" 00:29:07.055 ], 00:29:07.055 "product_name": "Raid Volume", 00:29:07.055 "block_size": 4128, 00:29:07.055 "num_blocks": 7936, 00:29:07.055 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:07.055 "md_size": 32, 00:29:07.055 "md_interleave": true, 00:29:07.055 "dif_type": 0, 00:29:07.055 "assigned_rate_limits": { 00:29:07.055 "rw_ios_per_sec": 0, 00:29:07.055 "rw_mbytes_per_sec": 0, 00:29:07.055 "r_mbytes_per_sec": 0, 00:29:07.055 "w_mbytes_per_sec": 0 00:29:07.055 }, 00:29:07.055 "claimed": false, 00:29:07.055 "zoned": false, 00:29:07.055 "supported_io_types": { 00:29:07.055 "read": true, 00:29:07.055 "write": true, 00:29:07.055 "unmap": false, 00:29:07.055 "flush": false, 00:29:07.055 "reset": true, 00:29:07.055 "nvme_admin": false, 00:29:07.055 "nvme_io": false, 00:29:07.055 "nvme_io_md": false, 00:29:07.055 "write_zeroes": true, 00:29:07.055 "zcopy": false, 00:29:07.055 "get_zone_info": false, 00:29:07.055 "zone_management": false, 00:29:07.055 "zone_append": false, 00:29:07.055 "compare": false, 00:29:07.055 "compare_and_write": false, 00:29:07.055 "abort": false, 00:29:07.055 "seek_hole": false, 00:29:07.055 "seek_data": false, 00:29:07.055 "copy": false, 00:29:07.055 "nvme_iov_md": false 00:29:07.055 }, 00:29:07.055 "memory_domains": [ 00:29:07.055 { 00:29:07.055 "dma_device_id": "system", 00:29:07.055 "dma_device_type": 1 00:29:07.055 }, 00:29:07.055 { 00:29:07.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.055 "dma_device_type": 2 00:29:07.055 }, 00:29:07.055 { 
00:29:07.055 "dma_device_id": "system", 00:29:07.055 "dma_device_type": 1 00:29:07.055 }, 00:29:07.055 { 00:29:07.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.055 "dma_device_type": 2 00:29:07.055 } 00:29:07.055 ], 00:29:07.055 "driver_specific": { 00:29:07.055 "raid": { 00:29:07.055 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:07.055 "strip_size_kb": 0, 00:29:07.055 "state": "online", 00:29:07.055 "raid_level": "raid1", 00:29:07.055 "superblock": true, 00:29:07.055 "num_base_bdevs": 2, 00:29:07.055 "num_base_bdevs_discovered": 2, 00:29:07.055 "num_base_bdevs_operational": 2, 00:29:07.055 "base_bdevs_list": [ 00:29:07.055 { 00:29:07.055 "name": "pt1", 00:29:07.055 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:07.055 "is_configured": true, 00:29:07.055 "data_offset": 256, 00:29:07.055 "data_size": 7936 00:29:07.055 }, 00:29:07.055 { 00:29:07.055 "name": "pt2", 00:29:07.055 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:07.055 "is_configured": true, 00:29:07.055 "data_offset": 256, 00:29:07.055 "data_size": 7936 00:29:07.055 } 00:29:07.055 ] 00:29:07.055 } 00:29:07.055 } 00:29:07.055 }' 00:29:07.055 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:07.055 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:07.055 pt2' 00:29:07.055 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:07.055 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:07.055 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:07.313 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:07.313 "name": "pt1", 00:29:07.313 "aliases": [ 00:29:07.313 "00000000-0000-0000-0000-000000000001" 00:29:07.313 ], 00:29:07.313 "product_name": "passthru", 00:29:07.313 "block_size": 4128, 00:29:07.313 "num_blocks": 8192, 00:29:07.313 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:07.313 "md_size": 32, 00:29:07.313 "md_interleave": true, 00:29:07.313 "dif_type": 0, 00:29:07.313 "assigned_rate_limits": { 00:29:07.313 "rw_ios_per_sec": 0, 00:29:07.313 "rw_mbytes_per_sec": 0, 00:29:07.313 "r_mbytes_per_sec": 0, 00:29:07.313 "w_mbytes_per_sec": 0 00:29:07.313 }, 00:29:07.313 "claimed": true, 00:29:07.313 "claim_type": "exclusive_write", 00:29:07.313 "zoned": false, 00:29:07.313 "supported_io_types": { 00:29:07.313 "read": true, 00:29:07.313 "write": true, 00:29:07.313 "unmap": true, 00:29:07.313 "flush": true, 00:29:07.313 "reset": true, 00:29:07.313 "nvme_admin": false, 00:29:07.313 "nvme_io": false, 00:29:07.313 "nvme_io_md": false, 00:29:07.313 "write_zeroes": true, 00:29:07.313 "zcopy": true, 00:29:07.313 "get_zone_info": false, 00:29:07.313 "zone_management": false, 00:29:07.313 "zone_append": false, 00:29:07.314 "compare": false, 00:29:07.314 "compare_and_write": false, 00:29:07.314 "abort": true, 00:29:07.314 "seek_hole": false, 00:29:07.314 "seek_data": false, 00:29:07.314 "copy": true, 00:29:07.314 "nvme_iov_md": false 00:29:07.314 }, 00:29:07.314 "memory_domains": [ 00:29:07.314 { 00:29:07.314 "dma_device_id": "system", 00:29:07.314 "dma_device_type": 1 00:29:07.314 }, 00:29:07.314 { 
00:29:07.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.314 "dma_device_type": 2 00:29:07.314 } 00:29:07.314 ], 00:29:07.314 "driver_specific": { 00:29:07.314 "passthru": { 00:29:07.314 "name": "pt1", 00:29:07.314 "base_bdev_name": "malloc1" 00:29:07.314 } 00:29:07.314 } 00:29:07.314 }' 00:29:07.314 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:07.314 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:07.314 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:07.314 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:07.314 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:07.572 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:07.572 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:07.572 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:07.572 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:07.572 07:34:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:07.572 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:07.572 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:07.572 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:07.572 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:07.572 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:07.830 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:07.830 "name": "pt2", 00:29:07.830 "aliases": [ 00:29:07.830 "00000000-0000-0000-0000-000000000002" 00:29:07.830 ], 00:29:07.830 "product_name": "passthru", 00:29:07.830 "block_size": 4128, 00:29:07.830 "num_blocks": 8192, 00:29:07.830 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:07.830 "md_size": 32, 00:29:07.830 "md_interleave": true, 00:29:07.830 "dif_type": 0, 00:29:07.830 "assigned_rate_limits": { 00:29:07.830 "rw_ios_per_sec": 0, 00:29:07.830 "rw_mbytes_per_sec": 0, 00:29:07.830 "r_mbytes_per_sec": 0, 00:29:07.830 "w_mbytes_per_sec": 0 00:29:07.830 }, 00:29:07.830 "claimed": true, 00:29:07.830 "claim_type": "exclusive_write", 00:29:07.830 "zoned": false, 00:29:07.830 "supported_io_types": { 00:29:07.830 "read": true, 00:29:07.830 "write": true, 00:29:07.830 "unmap": true, 00:29:07.830 "flush": true, 00:29:07.830 "reset": true, 00:29:07.830 "nvme_admin": false, 00:29:07.830 "nvme_io": false, 00:29:07.830 "nvme_io_md": false, 00:29:07.830 "write_zeroes": true, 00:29:07.830 "zcopy": true, 00:29:07.830 "get_zone_info": false, 00:29:07.830 "zone_management": false, 00:29:07.830 "zone_append": false, 00:29:07.830 "compare": false, 00:29:07.830 "compare_and_write": false, 00:29:07.830 "abort": true, 00:29:07.830 "seek_hole": false, 00:29:07.830 "seek_data": false, 00:29:07.830 "copy": true, 00:29:07.830 
"nvme_iov_md": false 00:29:07.830 }, 00:29:07.830 "memory_domains": [ 00:29:07.830 { 00:29:07.830 "dma_device_id": "system", 00:29:07.830 "dma_device_type": 1 00:29:07.830 }, 00:29:07.830 { 00:29:07.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.830 "dma_device_type": 2 00:29:07.830 } 00:29:07.830 ], 00:29:07.830 "driver_specific": { 00:29:07.830 "passthru": { 00:29:07.830 "name": "pt2", 00:29:07.830 "base_bdev_name": "malloc2" 00:29:07.830 } 00:29:07.830 } 00:29:07.830 }' 00:29:07.830 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:07.830 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:08.087 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:08.345 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:08.345 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:08.345 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:29:08.345 [2024-07-25 07:34:40.850268] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:08.345 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ae0bb4d8-1f3f-49bc-b474-c936fa2ee733 00:29:08.345 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # '[' -z ae0bb4d8-1f3f-49bc-b474-c936fa2ee733 ']' 00:29:08.345 07:34:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:08.603 [2024-07-25 07:34:41.078625] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:08.603 [2024-07-25 07:34:41.078640] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:08.603 [2024-07-25 07:34:41.078686] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:08.603 [2024-07-25 07:34:41.078734] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:08.603 [2024-07-25 07:34:41.078745] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1df1f90 name raid_bdev1, state offline 00:29:08.603 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.603 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:29:08.861 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:29:08.861 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:29:08.861 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:29:08.861 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:09.119 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:29:09.119 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:09.377 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:09.377 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:09.635 07:34:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:09.894 [2024-07-25 07:34:42.209594] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:09.894 [2024-07-25 07:34:42.210831] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:09.894 [2024-07-25 07:34:42.210882] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:09.894 [2024-07-25 07:34:42.210920] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:09.894 [2024-07-25 07:34:42.210937] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:09.894 [2024-07-25 07:34:42.210946] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c70510 name raid_bdev1, state configuring 00:29:09.894 request: 00:29:09.894 { 00:29:09.894 "name": "raid_bdev1", 00:29:09.894 "raid_level": "raid1", 00:29:09.894 "base_bdevs": [ 00:29:09.894 "malloc1", 00:29:09.894 "malloc2" 00:29:09.894 ], 00:29:09.894 "superblock": false, 00:29:09.894 "method": "bdev_raid_create", 00:29:09.894 "req_id": 1 00:29:09.894 } 00:29:09.894 Got JSON-RPC error response 00:29:09.894 response: 00:29:09.894 { 00:29:09.894 "code": -17, 00:29:09.894 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:09.894 } 00:29:09.894 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:29:09.894 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:29:09.894 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:29:09.894 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:29:09.894 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.894 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:29:10.152 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:29:10.152 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:29:10.152 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:10.152 [2024-07-25 07:34:42.670745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:10.152 [2024-07-25 07:34:42.670783] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:10.152 [2024-07-25 07:34:42.670800] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df29c0 00:29:10.152 [2024-07-25 07:34:42.670816] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:10.152 [2024-07-25 07:34:42.672102] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:29:10.152 [2024-07-25 07:34:42.672128] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:10.152 [2024-07-25 07:34:42.672177] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:10.152 [2024-07-25 07:34:42.672201] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:10.152 pt1 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.410 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.410 "name": "raid_bdev1", 00:29:10.410 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:10.410 "strip_size_kb": 0, 00:29:10.410 "state": "configuring", 00:29:10.410 "raid_level": "raid1", 00:29:10.410 "superblock": true, 00:29:10.410 "num_base_bdevs": 2, 00:29:10.410 "num_base_bdevs_discovered": 1, 00:29:10.410 "num_base_bdevs_operational": 2, 00:29:10.410 "base_bdevs_list": [ 00:29:10.410 { 00:29:10.410 "name": "pt1", 00:29:10.410 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:10.410 "is_configured": true, 00:29:10.410 "data_offset": 256, 00:29:10.410 "data_size": 7936 00:29:10.410 }, 00:29:10.410 { 00:29:10.410 "name": null, 00:29:10.410 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:10.410 "is_configured": false, 00:29:10.410 "data_offset": 256, 00:29:10.410 "data_size": 7936 00:29:10.411 } 00:29:10.411 ] 00:29:10.411 }' 00:29:10.411 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.411 07:34:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:10.988 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:29:10.988 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:29:10.988 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < 
num_base_bdevs )) 00:29:10.988 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:11.248 [2024-07-25 07:34:43.709489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:11.248 [2024-07-25 07:34:43.709535] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:11.248 [2024-07-25 07:34:43.709551] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df0d80 00:29:11.248 [2024-07-25 07:34:43.709562] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:11.248 [2024-07-25 07:34:43.709708] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:11.248 [2024-07-25 07:34:43.709724] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:11.248 [2024-07-25 07:34:43.709768] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:11.248 [2024-07-25 07:34:43.709786] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:11.248 [2024-07-25 07:34:43.709867] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1df3ed0 00:29:11.248 [2024-07-25 07:34:43.709877] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:11.248 [2024-07-25 07:34:43.709926] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df5120 00:29:11.248 [2024-07-25 07:34:43.709997] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1df3ed0 00:29:11.248 [2024-07-25 07:34:43.710006] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1df3ed0 00:29:11.248 [2024-07-25 07:34:43.710058] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:11.248 pt2 00:29:11.248 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:29:11.248 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:29:11.248 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:11.248 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:11.248 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:11.248 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:11.249 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:11.249 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:11.249 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:11.249 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:11.249 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:11.249 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:11.249 07:34:43 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.249 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.507 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:11.507 "name": "raid_bdev1", 00:29:11.507 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:11.507 "strip_size_kb": 0, 00:29:11.507 "state": "online", 00:29:11.507 "raid_level": "raid1", 00:29:11.507 "superblock": true, 00:29:11.507 "num_base_bdevs": 2, 00:29:11.507 "num_base_bdevs_discovered": 2, 00:29:11.507 "num_base_bdevs_operational": 2, 00:29:11.507 "base_bdevs_list": [ 00:29:11.507 { 00:29:11.507 "name": "pt1", 00:29:11.507 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:11.507 "is_configured": true, 00:29:11.507 "data_offset": 256, 00:29:11.507 "data_size": 7936 00:29:11.507 }, 00:29:11.507 { 00:29:11.507 "name": "pt2", 00:29:11.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:11.507 "is_configured": true, 00:29:11.507 "data_offset": 256, 00:29:11.507 "data_size": 7936 00:29:11.507 } 00:29:11.507 ] 00:29:11.507 }' 00:29:11.507 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:11.507 07:34:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:12.075 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:12.333 [2024-07-25 07:34:44.696451] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:12.333 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:12.334 "name": "raid_bdev1", 00:29:12.334 "aliases": [ 00:29:12.334 "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733" 00:29:12.334 ], 00:29:12.334 "product_name": "Raid Volume", 00:29:12.334 "block_size": 4128, 00:29:12.334 "num_blocks": 7936, 00:29:12.334 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:12.334 "md_size": 32, 00:29:12.334 "md_interleave": true, 00:29:12.334 "dif_type": 0, 00:29:12.334 "assigned_rate_limits": { 00:29:12.334 "rw_ios_per_sec": 0, 00:29:12.334 "rw_mbytes_per_sec": 0, 00:29:12.334 "r_mbytes_per_sec": 0, 00:29:12.334 "w_mbytes_per_sec": 0 00:29:12.334 }, 00:29:12.334 "claimed": false, 00:29:12.334 "zoned": false, 00:29:12.334 "supported_io_types": { 00:29:12.334 "read": true, 00:29:12.334 
"write": true, 00:29:12.334 "unmap": false, 00:29:12.334 "flush": false, 00:29:12.334 "reset": true, 00:29:12.334 "nvme_admin": false, 00:29:12.334 "nvme_io": false, 00:29:12.334 "nvme_io_md": false, 00:29:12.334 "write_zeroes": true, 00:29:12.334 "zcopy": false, 00:29:12.334 "get_zone_info": false, 00:29:12.334 "zone_management": false, 00:29:12.334 "zone_append": false, 00:29:12.334 "compare": false, 00:29:12.334 "compare_and_write": false, 00:29:12.334 "abort": false, 00:29:12.334 "seek_hole": false, 00:29:12.334 "seek_data": false, 00:29:12.334 "copy": false, 00:29:12.334 "nvme_iov_md": false 00:29:12.334 }, 00:29:12.334 "memory_domains": [ 00:29:12.334 { 00:29:12.334 "dma_device_id": "system", 00:29:12.334 "dma_device_type": 1 00:29:12.334 }, 00:29:12.334 { 00:29:12.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:12.334 "dma_device_type": 2 00:29:12.334 }, 00:29:12.334 { 00:29:12.334 "dma_device_id": "system", 00:29:12.334 "dma_device_type": 1 00:29:12.334 }, 00:29:12.334 { 00:29:12.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:12.334 "dma_device_type": 2 00:29:12.334 } 00:29:12.334 ], 00:29:12.334 "driver_specific": { 00:29:12.334 "raid": { 00:29:12.334 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:12.334 "strip_size_kb": 0, 00:29:12.334 "state": "online", 00:29:12.334 "raid_level": "raid1", 00:29:12.334 "superblock": true, 00:29:12.334 "num_base_bdevs": 2, 00:29:12.334 "num_base_bdevs_discovered": 2, 00:29:12.334 "num_base_bdevs_operational": 2, 00:29:12.334 "base_bdevs_list": [ 00:29:12.334 { 00:29:12.334 "name": "pt1", 00:29:12.334 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:12.334 "is_configured": true, 00:29:12.334 "data_offset": 256, 00:29:12.334 "data_size": 7936 00:29:12.334 }, 00:29:12.334 { 00:29:12.334 "name": "pt2", 00:29:12.334 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:12.334 "is_configured": true, 00:29:12.334 "data_offset": 256, 00:29:12.334 "data_size": 7936 00:29:12.334 } 00:29:12.334 ] 00:29:12.334 } 00:29:12.334 } 00:29:12.334 }' 00:29:12.334 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:12.334 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:12.334 pt2' 00:29:12.334 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:12.334 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:12.334 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:12.592 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:12.592 "name": "pt1", 00:29:12.592 "aliases": [ 00:29:12.592 "00000000-0000-0000-0000-000000000001" 00:29:12.592 ], 00:29:12.592 "product_name": "passthru", 00:29:12.592 "block_size": 4128, 00:29:12.592 "num_blocks": 8192, 00:29:12.592 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:12.592 "md_size": 32, 00:29:12.592 "md_interleave": true, 00:29:12.592 "dif_type": 0, 00:29:12.592 "assigned_rate_limits": { 00:29:12.592 "rw_ios_per_sec": 0, 00:29:12.592 "rw_mbytes_per_sec": 0, 00:29:12.592 "r_mbytes_per_sec": 0, 00:29:12.592 "w_mbytes_per_sec": 0 00:29:12.592 }, 00:29:12.592 "claimed": true, 00:29:12.592 "claim_type": 
"exclusive_write", 00:29:12.592 "zoned": false, 00:29:12.592 "supported_io_types": { 00:29:12.592 "read": true, 00:29:12.592 "write": true, 00:29:12.592 "unmap": true, 00:29:12.592 "flush": true, 00:29:12.592 "reset": true, 00:29:12.592 "nvme_admin": false, 00:29:12.592 "nvme_io": false, 00:29:12.592 "nvme_io_md": false, 00:29:12.592 "write_zeroes": true, 00:29:12.592 "zcopy": true, 00:29:12.592 "get_zone_info": false, 00:29:12.592 "zone_management": false, 00:29:12.592 "zone_append": false, 00:29:12.592 "compare": false, 00:29:12.592 "compare_and_write": false, 00:29:12.592 "abort": true, 00:29:12.592 "seek_hole": false, 00:29:12.592 "seek_data": false, 00:29:12.592 "copy": true, 00:29:12.592 "nvme_iov_md": false 00:29:12.592 }, 00:29:12.592 "memory_domains": [ 00:29:12.592 { 00:29:12.592 "dma_device_id": "system", 00:29:12.592 "dma_device_type": 1 00:29:12.592 }, 00:29:12.592 { 00:29:12.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:12.592 "dma_device_type": 2 00:29:12.592 } 00:29:12.593 ], 00:29:12.593 "driver_specific": { 00:29:12.593 "passthru": { 00:29:12.593 "name": "pt1", 00:29:12.593 "base_bdev_name": "malloc1" 00:29:12.593 } 00:29:12.593 } 00:29:12.593 }' 00:29:12.593 07:34:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:12.593 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:12.593 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:12.593 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:12.593 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:12.851 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:13.109 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:13.109 "name": "pt2", 00:29:13.109 "aliases": [ 00:29:13.109 "00000000-0000-0000-0000-000000000002" 00:29:13.109 ], 00:29:13.109 "product_name": "passthru", 00:29:13.109 "block_size": 4128, 00:29:13.109 "num_blocks": 8192, 00:29:13.109 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:13.109 "md_size": 32, 00:29:13.109 "md_interleave": true, 00:29:13.109 "dif_type": 0, 00:29:13.109 "assigned_rate_limits": { 00:29:13.109 
"rw_ios_per_sec": 0, 00:29:13.109 "rw_mbytes_per_sec": 0, 00:29:13.109 "r_mbytes_per_sec": 0, 00:29:13.109 "w_mbytes_per_sec": 0 00:29:13.109 }, 00:29:13.109 "claimed": true, 00:29:13.109 "claim_type": "exclusive_write", 00:29:13.109 "zoned": false, 00:29:13.109 "supported_io_types": { 00:29:13.109 "read": true, 00:29:13.109 "write": true, 00:29:13.109 "unmap": true, 00:29:13.109 "flush": true, 00:29:13.109 "reset": true, 00:29:13.109 "nvme_admin": false, 00:29:13.109 "nvme_io": false, 00:29:13.109 "nvme_io_md": false, 00:29:13.109 "write_zeroes": true, 00:29:13.109 "zcopy": true, 00:29:13.109 "get_zone_info": false, 00:29:13.109 "zone_management": false, 00:29:13.109 "zone_append": false, 00:29:13.109 "compare": false, 00:29:13.109 "compare_and_write": false, 00:29:13.109 "abort": true, 00:29:13.109 "seek_hole": false, 00:29:13.109 "seek_data": false, 00:29:13.109 "copy": true, 00:29:13.109 "nvme_iov_md": false 00:29:13.109 }, 00:29:13.109 "memory_domains": [ 00:29:13.109 { 00:29:13.109 "dma_device_id": "system", 00:29:13.109 "dma_device_type": 1 00:29:13.109 }, 00:29:13.109 { 00:29:13.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:13.109 "dma_device_type": 2 00:29:13.109 } 00:29:13.109 ], 00:29:13.109 "driver_specific": { 00:29:13.109 "passthru": { 00:29:13.109 "name": "pt2", 00:29:13.109 "base_bdev_name": "malloc2" 00:29:13.109 } 00:29:13.109 } 00:29:13.109 }' 00:29:13.109 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.109 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:13.368 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:13.626 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:29:13.626 07:34:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:13.626 [2024-07-25 07:34:46.112179] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:13.626 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # '[' ae0bb4d8-1f3f-49bc-b474-c936fa2ee733 '!=' ae0bb4d8-1f3f-49bc-b474-c936fa2ee733 ']' 00:29:13.626 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:29:13.626 07:34:46 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:13.626 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:13.626 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:13.885 [2024-07-25 07:34:46.352595] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.885 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:14.144 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:14.144 "name": "raid_bdev1", 00:29:14.144 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:14.144 "strip_size_kb": 0, 00:29:14.144 "state": "online", 00:29:14.144 "raid_level": "raid1", 00:29:14.144 "superblock": true, 00:29:14.144 "num_base_bdevs": 2, 00:29:14.144 "num_base_bdevs_discovered": 1, 00:29:14.144 "num_base_bdevs_operational": 1, 00:29:14.144 "base_bdevs_list": [ 00:29:14.144 { 00:29:14.144 "name": null, 00:29:14.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:14.144 "is_configured": false, 00:29:14.144 "data_offset": 256, 00:29:14.144 "data_size": 7936 00:29:14.144 }, 00:29:14.144 { 00:29:14.144 "name": "pt2", 00:29:14.144 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:14.144 "is_configured": true, 00:29:14.144 "data_offset": 256, 00:29:14.144 "data_size": 7936 00:29:14.144 } 00:29:14.144 ] 00:29:14.144 }' 00:29:14.144 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:14.144 07:34:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:14.710 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:14.968 [2024-07-25 
07:34:47.391307] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:14.968 [2024-07-25 07:34:47.391330] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:14.968 [2024-07-25 07:34:47.391379] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:14.968 [2024-07-25 07:34:47.391418] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:14.968 [2024-07-25 07:34:47.391429] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1df3ed0 name raid_bdev1, state offline 00:29:14.968 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:14.968 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:29:15.226 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:29:15.226 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:29:15.226 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:29:15.226 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:29:15.226 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:15.484 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:29:15.484 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:29:15.484 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:29:15.484 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:29:15.484 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@534 -- # i=1 00:29:15.484 07:34:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:15.743 [2024-07-25 07:34:48.077122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:15.743 [2024-07-25 07:34:48.077167] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:15.743 [2024-07-25 07:34:48.077185] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c6f6c0 00:29:15.743 [2024-07-25 07:34:48.077196] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:15.743 [2024-07-25 07:34:48.078507] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:15.743 [2024-07-25 07:34:48.078533] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:15.743 [2024-07-25 07:34:48.078577] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:15.743 [2024-07-25 07:34:48.078602] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:15.743 [2024-07-25 07:34:48.078664] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 
0x1df42f0 00:29:15.743 [2024-07-25 07:34:48.078674] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:15.743 [2024-07-25 07:34:48.078725] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e0bc20 00:29:15.743 [2024-07-25 07:34:48.078790] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1df42f0 00:29:15.743 [2024-07-25 07:34:48.078799] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1df42f0 00:29:15.743 [2024-07-25 07:34:48.078854] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:15.743 pt2 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.743 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:16.002 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:16.002 "name": "raid_bdev1", 00:29:16.002 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:16.002 "strip_size_kb": 0, 00:29:16.002 "state": "online", 00:29:16.002 "raid_level": "raid1", 00:29:16.002 "superblock": true, 00:29:16.002 "num_base_bdevs": 2, 00:29:16.002 "num_base_bdevs_discovered": 1, 00:29:16.002 "num_base_bdevs_operational": 1, 00:29:16.002 "base_bdevs_list": [ 00:29:16.002 { 00:29:16.002 "name": null, 00:29:16.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:16.002 "is_configured": false, 00:29:16.002 "data_offset": 256, 00:29:16.002 "data_size": 7936 00:29:16.002 }, 00:29:16.002 { 00:29:16.002 "name": "pt2", 00:29:16.002 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:16.002 "is_configured": true, 00:29:16.002 "data_offset": 256, 00:29:16.002 "data_size": 7936 00:29:16.002 } 00:29:16.002 ] 00:29:16.002 }' 00:29:16.002 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:16.002 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:16.569 07:34:48 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:16.827 [2024-07-25 07:34:49.103993] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:16.827 [2024-07-25 07:34:49.104017] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:16.827 [2024-07-25 07:34:49.104061] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:16.827 [2024-07-25 07:34:49.104099] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:16.827 [2024-07-25 07:34:49.104110] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1df42f0 name raid_bdev1, state offline 00:29:16.827 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.827 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:29:16.827 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:29:16.827 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:29:16.827 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:29:16.827 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:17.085 [2024-07-25 07:34:49.557176] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:17.085 [2024-07-25 07:34:49.557214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:17.085 [2024-07-25 07:34:49.557229] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df3910 00:29:17.085 [2024-07-25 07:34:49.557240] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.085 [2024-07-25 07:34:49.558543] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:17.085 [2024-07-25 07:34:49.558568] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:17.085 [2024-07-25 07:34:49.558609] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:17.085 [2024-07-25 07:34:49.558632] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:17.085 [2024-07-25 07:34:49.558702] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:17.085 [2024-07-25 07:34:49.558713] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:17.085 [2024-07-25 07:34:49.558725] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1df5400 name raid_bdev1, state configuring 00:29:17.085 [2024-07-25 07:34:49.558744] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:17.085 [2024-07-25 07:34:49.558794] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1df5400 00:29:17.085 [2024-07-25 07:34:49.558803] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:17.085 [2024-07-25 07:34:49.558856] 
bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df4fc0 00:29:17.085 [2024-07-25 07:34:49.558921] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1df5400 00:29:17.085 [2024-07-25 07:34:49.558930] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1df5400 00:29:17.085 [2024-07-25 07:34:49.558982] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:17.085 pt1 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.085 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.343 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.343 "name": "raid_bdev1", 00:29:17.343 "uuid": "ae0bb4d8-1f3f-49bc-b474-c936fa2ee733", 00:29:17.343 "strip_size_kb": 0, 00:29:17.343 "state": "online", 00:29:17.343 "raid_level": "raid1", 00:29:17.343 "superblock": true, 00:29:17.343 "num_base_bdevs": 2, 00:29:17.343 "num_base_bdevs_discovered": 1, 00:29:17.343 "num_base_bdevs_operational": 1, 00:29:17.343 "base_bdevs_list": [ 00:29:17.343 { 00:29:17.343 "name": null, 00:29:17.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:17.343 "is_configured": false, 00:29:17.343 "data_offset": 256, 00:29:17.343 "data_size": 7936 00:29:17.343 }, 00:29:17.343 { 00:29:17.343 "name": "pt2", 00:29:17.343 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:17.343 "is_configured": true, 00:29:17.343 "data_offset": 256, 00:29:17.343 "data_size": 7936 00:29:17.343 } 00:29:17.343 ] 00:29:17.343 }' 00:29:17.343 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.343 07:34:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:17.907 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:17.907 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:18.165 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:29:18.165 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:18.165 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:29:18.423 [2024-07-25 07:34:50.780689] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # '[' ae0bb4d8-1f3f-49bc-b474-c936fa2ee733 '!=' ae0bb4d8-1f3f-49bc-b474-c936fa2ee733 ']' 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@578 -- # killprocess 1775282 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1775282 ']' 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1775282 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1775282 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1775282' 00:29:18.423 killing process with pid 1775282 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@969 -- # kill 1775282 00:29:18.423 [2024-07-25 07:34:50.858375] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:18.423 [2024-07-25 07:34:50.858423] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:18.423 [2024-07-25 07:34:50.858464] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:18.423 [2024-07-25 07:34:50.858475] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1df5400 name raid_bdev1, state offline 00:29:18.423 07:34:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@974 -- # wait 1775282 00:29:18.423 [2024-07-25 07:34:50.874412] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:18.681 07:34:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@580 -- # return 0 00:29:18.681 00:29:18.681 real 0m14.752s 00:29:18.681 user 0m26.714s 00:29:18.681 sys 0m2.768s 00:29:18.681 07:34:51 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:18.681 07:34:51 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:18.681 
************************************ 00:29:18.681 END TEST raid_superblock_test_md_interleaved 00:29:18.681 ************************************ 00:29:18.681 07:34:51 bdev_raid -- bdev/bdev_raid.sh@994 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:29:18.681 07:34:51 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:29:18.681 07:34:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:18.681 07:34:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:18.681 ************************************ 00:29:18.681 START TEST raid_rebuild_test_sb_md_interleaved 00:29:18.681 ************************************ 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false false 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # local verify=false 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # local strip_size 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # local create_arg 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@594 -- # local data_offset 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 
-- # '[' true = true ']' 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # raid_pid=1777987 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@613 -- # waitforlisten 1777987 /var/tmp/spdk-raid.sock 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1777987 ']' 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:18.681 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:18.681 07:34:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:18.681 [2024-07-25 07:34:51.210794] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:18.681 [2024-07-25 07:34:51.210849] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1777987 ] 00:29:18.681 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:18.681 Zero copy mechanism will not be used. 
00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:18.940 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:18.940 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:18.940 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:18.940 [2024-07-25 07:34:51.329564] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:18.940 [2024-07-25 07:34:51.414980] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:19.198 [2024-07-25 07:34:51.476958] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:19.198 [2024-07-25 07:34:51.476989] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:19.763 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:19.763 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:29:19.763 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:29:19.763 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:29:20.021 BaseBdev1_malloc 00:29:20.021 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:20.021 [2024-07-25 07:34:52.553685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:20.021 [2024-07-25 07:34:52.553727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:20.021 [2024-07-25 07:34:52.553750] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15d5620 00:29:20.021 [2024-07-25 07:34:52.553761] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:20.279 [2024-07-25 07:34:52.555108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:20.279 [2024-07-25 07:34:52.555135] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:20.279 BaseBdev1 00:29:20.279 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:29:20.279 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:29:20.279 BaseBdev2_malloc 00:29:20.536 07:34:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:20.536 [2024-07-25 07:34:53.023541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:20.536 [2024-07-25 07:34:53.023582] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:20.537 [2024-07-25 07:34:53.023607] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ba6b0 00:29:20.537 [2024-07-25 07:34:53.023620] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:20.537 [2024-07-25 07:34:53.024911] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:20.537 [2024-07-25 07:34:53.024936] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:20.537 BaseBdev2 00:29:20.537 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:29:20.795 spare_malloc 00:29:20.795 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:21.052 spare_delay 00:29:21.052 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:21.310 [2024-07-25 07:34:53.709929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:21.310 [2024-07-25 07:34:53.709969] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:21.310 [2024-07-25 07:34:53.709991] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15bb180 00:29:21.310 [2024-07-25 07:34:53.710003] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:21.310 [2024-07-25 07:34:53.711229] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:21.310 [2024-07-25 07:34:53.711255] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:21.310 spare 00:29:21.310 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:21.567 [2024-07-25 07:34:53.934546] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:21.567 [2024-07-25 07:34:53.935690] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:21.567 [2024-07-25 07:34:53.935849] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15c59a0 00:29:21.567 [2024-07-25 07:34:53.935861] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:21.567 [2024-07-25 07:34:53.935922] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1432630 00:29:21.567 [2024-07-25 07:34:53.935997] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15c59a0 00:29:21.567 [2024-07-25 07:34:53.936006] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
raid_bdev1, raid_bdev 0x15c59a0 00:29:21.567 [2024-07-25 07:34:53.936056] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.567 07:34:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.825 07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:21.825 "name": "raid_bdev1", 00:29:21.825 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:21.825 "strip_size_kb": 0, 00:29:21.825 "state": "online", 00:29:21.825 "raid_level": "raid1", 00:29:21.825 "superblock": true, 00:29:21.825 "num_base_bdevs": 2, 00:29:21.825 "num_base_bdevs_discovered": 2, 00:29:21.825 "num_base_bdevs_operational": 2, 00:29:21.825 "base_bdevs_list": [ 00:29:21.825 { 00:29:21.825 "name": "BaseBdev1", 00:29:21.825 "uuid": "e0887e3f-4cf5-5ab0-bfbe-a0496031c744", 00:29:21.825 "is_configured": true, 00:29:21.825 "data_offset": 256, 00:29:21.825 "data_size": 7936 00:29:21.825 }, 00:29:21.825 { 00:29:21.825 "name": "BaseBdev2", 00:29:21.825 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:21.825 "is_configured": true, 00:29:21.825 "data_offset": 256, 00:29:21.825 "data_size": 7936 00:29:21.825 } 00:29:21.825 ] 00:29:21.825 }' 00:29:21.825 07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:21.825 07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:22.390 07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:22.390 07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:29:22.390 [2024-07-25 07:34:54.905310] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:22.647 07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:29:22.647 
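The trace above builds two passthru base bdevs, assembles them into a raid1 bdev with a superblock, and then checks the result by filtering the output of bdev_raid_get_bdevs through jq (the verify_raid_bdev_state helper). A minimal stand-alone sketch of that check, assuming rpc.py and jq are available and the target application is listening on /var/tmp/spdk-raid.sock:

#!/usr/bin/env bash
# Minimal sketch of the state check traced above: read back all raid bdevs over
# the test RPC socket and assert a few fields with jq. Assumes rpc.py and jq are
# available and that the target application owns /var/tmp/spdk-raid.sock.
set -euo pipefail

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

state=$(jq -r '.state' <<< "$info")
level=$(jq -r '.raid_level' <<< "$info")
discovered=$(jq -r '.num_base_bdevs_discovered' <<< "$info")

if [ "$state" != online ] || [ "$level" != raid1 ] || [ "$discovered" -ne 2 ]; then
    echo "unexpected raid_bdev1 state: $info" >&2
    exit 1
fi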
07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.647 07:34:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:22.647 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:29:22.647 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:29:22.647 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # '[' false = true ']' 00:29:22.647 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:22.905 [2024-07-25 07:34:55.366271] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.905 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.162 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:23.162 "name": "raid_bdev1", 00:29:23.162 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:23.162 "strip_size_kb": 0, 00:29:23.162 "state": "online", 00:29:23.162 "raid_level": "raid1", 00:29:23.162 "superblock": true, 00:29:23.162 "num_base_bdevs": 2, 00:29:23.162 "num_base_bdevs_discovered": 1, 00:29:23.162 "num_base_bdevs_operational": 1, 00:29:23.162 "base_bdevs_list": [ 00:29:23.162 { 00:29:23.162 "name": null, 00:29:23.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.162 "is_configured": false, 00:29:23.162 "data_offset": 256, 00:29:23.162 "data_size": 7936 00:29:23.162 }, 00:29:23.162 { 00:29:23.162 "name": "BaseBdev2", 00:29:23.162 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:23.162 "is_configured": true, 00:29:23.162 "data_offset": 256, 00:29:23.162 
"data_size": 7936 00:29:23.162 } 00:29:23.162 ] 00:29:23.162 }' 00:29:23.162 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.162 07:34:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:23.729 07:34:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:23.987 [2024-07-25 07:34:56.368948] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:23.987 [2024-07-25 07:34:56.372383] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1432630 00:29:23.987 [2024-07-25 07:34:56.374478] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:23.987 07:34:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:24.919 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:24.919 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:24.919 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:24.919 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:24.919 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:24.919 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.919 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.177 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:25.177 "name": "raid_bdev1", 00:29:25.177 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:25.177 "strip_size_kb": 0, 00:29:25.177 "state": "online", 00:29:25.177 "raid_level": "raid1", 00:29:25.177 "superblock": true, 00:29:25.177 "num_base_bdevs": 2, 00:29:25.177 "num_base_bdevs_discovered": 2, 00:29:25.177 "num_base_bdevs_operational": 2, 00:29:25.177 "process": { 00:29:25.177 "type": "rebuild", 00:29:25.177 "target": "spare", 00:29:25.177 "progress": { 00:29:25.177 "blocks": 3072, 00:29:25.177 "percent": 38 00:29:25.177 } 00:29:25.177 }, 00:29:25.177 "base_bdevs_list": [ 00:29:25.177 { 00:29:25.177 "name": "spare", 00:29:25.177 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:25.177 "is_configured": true, 00:29:25.177 "data_offset": 256, 00:29:25.177 "data_size": 7936 00:29:25.177 }, 00:29:25.177 { 00:29:25.177 "name": "BaseBdev2", 00:29:25.177 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:25.177 "is_configured": true, 00:29:25.177 "data_offset": 256, 00:29:25.177 "data_size": 7936 00:29:25.177 } 00:29:25.177 ] 00:29:25.177 }' 00:29:25.177 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:25.177 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:25.177 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:29:25.435 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:25.435 07:34:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:25.435 [2024-07-25 07:34:57.923491] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:25.693 [2024-07-25 07:34:57.986264] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:25.693 [2024-07-25 07:34:57.986308] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:25.693 [2024-07-25 07:34:57.986322] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:25.693 [2024-07-25 07:34:57.986330] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.693 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.950 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.950 "name": "raid_bdev1", 00:29:25.950 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:25.950 "strip_size_kb": 0, 00:29:25.950 "state": "online", 00:29:25.950 "raid_level": "raid1", 00:29:25.950 "superblock": true, 00:29:25.950 "num_base_bdevs": 2, 00:29:25.950 "num_base_bdevs_discovered": 1, 00:29:25.950 "num_base_bdevs_operational": 1, 00:29:25.950 "base_bdevs_list": [ 00:29:25.950 { 00:29:25.950 "name": null, 00:29:25.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:25.950 "is_configured": false, 00:29:25.950 "data_offset": 256, 00:29:25.950 "data_size": 7936 00:29:25.950 }, 00:29:25.950 { 00:29:25.950 "name": "BaseBdev2", 00:29:25.950 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:25.950 "is_configured": true, 00:29:25.950 "data_offset": 256, 00:29:25.950 "data_size": 7936 
00:29:25.950 } 00:29:25.950 ] 00:29:25.950 }' 00:29:25.950 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.950 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:26.515 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:26.515 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:26.515 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:26.515 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:26.515 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:26.515 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:26.515 07:34:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:26.773 07:34:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:26.773 "name": "raid_bdev1", 00:29:26.773 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:26.773 "strip_size_kb": 0, 00:29:26.773 "state": "online", 00:29:26.773 "raid_level": "raid1", 00:29:26.773 "superblock": true, 00:29:26.773 "num_base_bdevs": 2, 00:29:26.773 "num_base_bdevs_discovered": 1, 00:29:26.773 "num_base_bdevs_operational": 1, 00:29:26.773 "base_bdevs_list": [ 00:29:26.773 { 00:29:26.773 "name": null, 00:29:26.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.773 "is_configured": false, 00:29:26.773 "data_offset": 256, 00:29:26.773 "data_size": 7936 00:29:26.773 }, 00:29:26.773 { 00:29:26.773 "name": "BaseBdev2", 00:29:26.773 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:26.773 "is_configured": true, 00:29:26.773 "data_offset": 256, 00:29:26.773 "data_size": 7936 00:29:26.773 } 00:29:26.773 ] 00:29:26.773 }' 00:29:26.773 07:34:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:26.773 07:34:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:26.773 07:34:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:26.773 07:34:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:26.773 07:34:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:27.030 [2024-07-25 07:34:59.353499] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:27.030 [2024-07-25 07:34:59.356966] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1431ac0 00:29:27.030 [2024-07-25 07:34:59.358318] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:27.030 07:34:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@678 -- # sleep 1 00:29:27.963 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@679 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:27.963 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:27.963 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:27.963 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:27.963 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:27.963 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.963 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:28.220 "name": "raid_bdev1", 00:29:28.220 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:28.220 "strip_size_kb": 0, 00:29:28.220 "state": "online", 00:29:28.220 "raid_level": "raid1", 00:29:28.220 "superblock": true, 00:29:28.220 "num_base_bdevs": 2, 00:29:28.220 "num_base_bdevs_discovered": 2, 00:29:28.220 "num_base_bdevs_operational": 2, 00:29:28.220 "process": { 00:29:28.220 "type": "rebuild", 00:29:28.220 "target": "spare", 00:29:28.220 "progress": { 00:29:28.220 "blocks": 3072, 00:29:28.220 "percent": 38 00:29:28.220 } 00:29:28.220 }, 00:29:28.220 "base_bdevs_list": [ 00:29:28.220 { 00:29:28.220 "name": "spare", 00:29:28.220 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:28.220 "is_configured": true, 00:29:28.220 "data_offset": 256, 00:29:28.220 "data_size": 7936 00:29:28.220 }, 00:29:28.220 { 00:29:28.220 "name": "BaseBdev2", 00:29:28.220 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:28.220 "is_configured": true, 00:29:28.220 "data_offset": 256, 00:29:28.220 "data_size": 7936 00:29:28.220 } 00:29:28.220 ] 00:29:28.220 }' 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:29:28.220 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # local timeout=1078 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < 
timeout )) 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.220 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.478 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:28.478 "name": "raid_bdev1", 00:29:28.478 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:28.478 "strip_size_kb": 0, 00:29:28.478 "state": "online", 00:29:28.478 "raid_level": "raid1", 00:29:28.478 "superblock": true, 00:29:28.478 "num_base_bdevs": 2, 00:29:28.478 "num_base_bdevs_discovered": 2, 00:29:28.478 "num_base_bdevs_operational": 2, 00:29:28.478 "process": { 00:29:28.478 "type": "rebuild", 00:29:28.478 "target": "spare", 00:29:28.478 "progress": { 00:29:28.478 "blocks": 3840, 00:29:28.478 "percent": 48 00:29:28.478 } 00:29:28.478 }, 00:29:28.478 "base_bdevs_list": [ 00:29:28.478 { 00:29:28.478 "name": "spare", 00:29:28.478 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:28.478 "is_configured": true, 00:29:28.478 "data_offset": 256, 00:29:28.478 "data_size": 7936 00:29:28.478 }, 00:29:28.478 { 00:29:28.478 "name": "BaseBdev2", 00:29:28.478 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:28.478 "is_configured": true, 00:29:28.478 "data_offset": 256, 00:29:28.478 "data_size": 7936 00:29:28.478 } 00:29:28.478 ] 00:29:28.478 }' 00:29:28.478 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:28.478 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:28.478 07:35:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:28.735 07:35:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:28.735 07:35:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.668 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:29.926 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:29.926 "name": "raid_bdev1", 00:29:29.926 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:29.926 "strip_size_kb": 0, 00:29:29.926 "state": "online", 00:29:29.926 "raid_level": "raid1", 00:29:29.926 "superblock": true, 00:29:29.926 "num_base_bdevs": 2, 00:29:29.926 "num_base_bdevs_discovered": 2, 00:29:29.926 "num_base_bdevs_operational": 2, 00:29:29.926 "process": { 00:29:29.926 "type": "rebuild", 00:29:29.926 "target": "spare", 00:29:29.926 "progress": { 00:29:29.926 "blocks": 7168, 00:29:29.926 "percent": 90 00:29:29.926 } 00:29:29.926 }, 00:29:29.926 "base_bdevs_list": [ 00:29:29.926 { 00:29:29.926 "name": "spare", 00:29:29.926 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:29.926 "is_configured": true, 00:29:29.926 "data_offset": 256, 00:29:29.926 "data_size": 7936 00:29:29.926 }, 00:29:29.926 { 00:29:29.926 "name": "BaseBdev2", 00:29:29.926 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:29.926 "is_configured": true, 00:29:29.926 "data_offset": 256, 00:29:29.926 "data_size": 7936 00:29:29.926 } 00:29:29.926 ] 00:29:29.926 }' 00:29:29.926 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:29.926 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:29.926 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:29.926 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:29.926 07:35:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:30.184 [2024-07-25 07:35:02.481039] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:30.184 [2024-07-25 07:35:02.481089] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:30.184 [2024-07-25 07:35:02.481173] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.115 
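The xtrace above drives the rebuild to completion with a polling loop: while (( SECONDS < timeout )), the script re-reads the raid bdev, checks .process.type and .process.target with jq, and sleeps one second between samples; the trace also records a shell error from an empty, unquoted operand at bdev_raid.sh line 681 ("[: =: unary operator expected"). A hedged sketch of the same polling pattern, with both operands quoted so an empty value cannot trip that error; wait_for_rebuild is a hypothetical helper name, not part of bdev_raid.sh:

#!/usr/bin/env bash
# Hypothetical helper (name not taken from bdev_raid.sh) that mirrors the traced
# loop: poll bdev_raid_get_bdevs until the rebuild process disappears or a
# timeout elapses. Same rpc.py/jq/socket assumptions as the sketch above.
set -euo pipefail

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

wait_for_rebuild() {
    local name=$1 timeout=${2:-60}
    local deadline=$((SECONDS + timeout)) ptype
    while (( SECONDS < deadline )); do
        ptype=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
            | jq -r --arg n "$name" '.[] | select(.name == $n) | .process.type // "none"')
        # Quoting both sides keeps the test well-formed even if ptype is empty.
        if [ "$ptype" = "none" ]; then
            return 0
        fi
        sleep 1
    done
    echo "rebuild on $name did not finish within ${timeout}s" >&2
    return 1
}

wait_for_rebuild raid_bdev1 120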
07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:31.115 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:31.115 "name": "raid_bdev1", 00:29:31.115 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:31.115 "strip_size_kb": 0, 00:29:31.115 "state": "online", 00:29:31.115 "raid_level": "raid1", 00:29:31.115 "superblock": true, 00:29:31.115 "num_base_bdevs": 2, 00:29:31.115 "num_base_bdevs_discovered": 2, 00:29:31.116 "num_base_bdevs_operational": 2, 00:29:31.116 "base_bdevs_list": [ 00:29:31.116 { 00:29:31.116 "name": "spare", 00:29:31.116 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:31.116 "is_configured": true, 00:29:31.116 "data_offset": 256, 00:29:31.116 "data_size": 7936 00:29:31.116 }, 00:29:31.116 { 00:29:31.116 "name": "BaseBdev2", 00:29:31.116 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:31.116 "is_configured": true, 00:29:31.116 "data_offset": 256, 00:29:31.116 "data_size": 7936 00:29:31.116 } 00:29:31.116 ] 00:29:31.116 }' 00:29:31.116 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:31.116 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:31.116 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@724 -- # break 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:31.373 "name": "raid_bdev1", 00:29:31.373 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:31.373 "strip_size_kb": 0, 00:29:31.373 "state": "online", 00:29:31.373 "raid_level": "raid1", 00:29:31.373 "superblock": true, 00:29:31.373 "num_base_bdevs": 2, 00:29:31.373 "num_base_bdevs_discovered": 2, 00:29:31.373 "num_base_bdevs_operational": 2, 00:29:31.373 "base_bdevs_list": [ 00:29:31.373 { 00:29:31.373 "name": "spare", 00:29:31.373 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:31.373 "is_configured": true, 00:29:31.373 "data_offset": 256, 00:29:31.373 "data_size": 7936 00:29:31.373 }, 00:29:31.373 { 00:29:31.373 "name": "BaseBdev2", 00:29:31.373 "uuid": "51a82558-f416-5844-b620-dadab5239800", 
00:29:31.373 "is_configured": true, 00:29:31.373 "data_offset": 256, 00:29:31.373 "data_size": 7936 00:29:31.373 } 00:29:31.373 ] 00:29:31.373 }' 00:29:31.373 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.631 07:35:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:31.889 07:35:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:31.889 "name": "raid_bdev1", 00:29:31.889 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:31.889 "strip_size_kb": 0, 00:29:31.889 "state": "online", 00:29:31.889 "raid_level": "raid1", 00:29:31.889 "superblock": true, 00:29:31.889 "num_base_bdevs": 2, 00:29:31.889 "num_base_bdevs_discovered": 2, 00:29:31.889 "num_base_bdevs_operational": 2, 00:29:31.889 "base_bdevs_list": [ 00:29:31.889 { 00:29:31.889 "name": "spare", 00:29:31.889 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:31.889 "is_configured": true, 00:29:31.889 "data_offset": 256, 00:29:31.889 "data_size": 7936 00:29:31.889 }, 00:29:31.889 { 00:29:31.889 "name": "BaseBdev2", 00:29:31.889 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:31.889 "is_configured": true, 00:29:31.889 "data_offset": 256, 00:29:31.889 "data_size": 7936 00:29:31.889 } 00:29:31.889 ] 00:29:31.889 }' 00:29:31.889 07:35:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:31.889 07:35:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:32.450 07:35:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:32.450 [2024-07-25 07:35:04.979185] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:32.450 [2024-07-25 07:35:04.979208] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:32.450 [2024-07-25 07:35:04.979259] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:32.450 [2024-07-25 07:35:04.979310] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:32.450 [2024-07-25 07:35:04.979321] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15c59a0 name raid_bdev1, state offline 00:29:32.708 07:35:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:32.708 07:35:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # jq length 00:29:32.708 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:29:32.708 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@737 -- # '[' false = true ']' 00:29:32.708 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:29:32.708 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:32.965 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:33.223 [2024-07-25 07:35:05.644889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:33.223 [2024-07-25 07:35:05.644927] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:33.223 [2024-07-25 07:35:05.644947] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15c5c20 00:29:33.223 [2024-07-25 07:35:05.644958] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:33.223 [2024-07-25 07:35:05.646612] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:33.223 [2024-07-25 07:35:05.646638] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:33.223 [2024-07-25 07:35:05.646688] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:33.223 [2024-07-25 07:35:05.646713] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:33.223 [2024-07-25 07:35:05.646791] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:33.223 spare 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.223 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:33.223 [2024-07-25 07:35:05.747096] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1430640 00:29:33.223 [2024-07-25 07:35:05.747110] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:33.223 [2024-07-25 07:35:05.747185] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15bd9f0 00:29:33.223 [2024-07-25 07:35:05.747268] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1430640 00:29:33.223 [2024-07-25 07:35:05.747277] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1430640 00:29:33.223 [2024-07-25 07:35:05.747343] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:33.480 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:33.480 "name": "raid_bdev1", 00:29:33.480 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:33.480 "strip_size_kb": 0, 00:29:33.480 "state": "online", 00:29:33.480 "raid_level": "raid1", 00:29:33.480 "superblock": true, 00:29:33.480 "num_base_bdevs": 2, 00:29:33.480 "num_base_bdevs_discovered": 2, 00:29:33.480 "num_base_bdevs_operational": 2, 00:29:33.480 "base_bdevs_list": [ 00:29:33.480 { 00:29:33.480 "name": "spare", 00:29:33.480 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:33.480 "is_configured": true, 00:29:33.480 "data_offset": 256, 00:29:33.480 "data_size": 7936 00:29:33.480 }, 00:29:33.480 { 00:29:33.480 "name": "BaseBdev2", 00:29:33.480 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:33.480 "is_configured": true, 00:29:33.480 "data_offset": 256, 00:29:33.480 "data_size": 7936 00:29:33.480 } 00:29:33.480 ] 00:29:33.480 }' 00:29:33.480 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:33.480 07:35:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:34.045 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:34.045 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:34.045 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:34.045 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=none 00:29:34.045 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:34.045 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.045 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:34.303 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:34.303 "name": "raid_bdev1", 00:29:34.303 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:34.303 "strip_size_kb": 0, 00:29:34.303 "state": "online", 00:29:34.303 "raid_level": "raid1", 00:29:34.303 "superblock": true, 00:29:34.303 "num_base_bdevs": 2, 00:29:34.303 "num_base_bdevs_discovered": 2, 00:29:34.303 "num_base_bdevs_operational": 2, 00:29:34.303 "base_bdevs_list": [ 00:29:34.303 { 00:29:34.303 "name": "spare", 00:29:34.303 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:34.303 "is_configured": true, 00:29:34.303 "data_offset": 256, 00:29:34.303 "data_size": 7936 00:29:34.303 }, 00:29:34.303 { 00:29:34.303 "name": "BaseBdev2", 00:29:34.303 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:34.303 "is_configured": true, 00:29:34.303 "data_offset": 256, 00:29:34.303 "data_size": 7936 00:29:34.303 } 00:29:34.303 ] 00:29:34.303 }' 00:29:34.303 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:34.303 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:34.303 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:34.303 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:34.303 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.303 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:34.561 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:29:34.561 07:35:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:34.818 [2024-07-25 07:35:07.193065] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:34.818 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:34.818 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:34.818 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:34.818 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:34.818 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:34.819 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:29:34.819 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:34.819 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:34.819 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:34.819 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:34.819 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.819 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:35.076 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:35.076 "name": "raid_bdev1", 00:29:35.076 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:35.076 "strip_size_kb": 0, 00:29:35.076 "state": "online", 00:29:35.076 "raid_level": "raid1", 00:29:35.076 "superblock": true, 00:29:35.076 "num_base_bdevs": 2, 00:29:35.076 "num_base_bdevs_discovered": 1, 00:29:35.076 "num_base_bdevs_operational": 1, 00:29:35.076 "base_bdevs_list": [ 00:29:35.076 { 00:29:35.076 "name": null, 00:29:35.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.076 "is_configured": false, 00:29:35.076 "data_offset": 256, 00:29:35.076 "data_size": 7936 00:29:35.076 }, 00:29:35.076 { 00:29:35.076 "name": "BaseBdev2", 00:29:35.076 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:35.076 "is_configured": true, 00:29:35.076 "data_offset": 256, 00:29:35.076 "data_size": 7936 00:29:35.076 } 00:29:35.076 ] 00:29:35.076 }' 00:29:35.076 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:35.076 07:35:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:35.641 07:35:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:35.901 [2024-07-25 07:35:08.223797] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:35.901 [2024-07-25 07:35:08.223931] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:35.901 [2024-07-25 07:35:08.223946] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
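Here the spare is removed and then handed back to the array; because its on-disk superblock carries an older sequence number (4) than the raid bdev's (5), the module logs "Re-adding bdev spare to raid bdev raid_bdev1" and starts another rebuild instead of rejecting the bdev. A short sketch of that remove/re-add round trip, under the same assumptions as the sketches above:

#!/usr/bin/env bash
# Sketch of the remove/re-add round trip traced here. The spare keeps its raid
# superblock, so on re-add the module detects the stale sequence number and
# schedules a rebuild onto it. Same rpc.py/jq/socket assumptions as above.
set -euo pipefail

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

"$rpc" -s "$sock" bdev_raid_remove_base_bdev spare
"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare

# The array should now report a rebuild targeting the re-added bdev.
"$rpc" -s "$sock" bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "raid_bdev1") | .process.target // "none"'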
00:29:35.901 [2024-07-25 07:35:08.223973] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:35.901 [2024-07-25 07:35:08.227323] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15c5fe0 00:29:35.901 [2024-07-25 07:35:08.229350] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:35.901 07:35:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # sleep 1 00:29:36.835 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:36.835 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:36.835 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:36.835 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:36.835 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:36.835 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.835 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:37.093 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:37.093 "name": "raid_bdev1", 00:29:37.093 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:37.093 "strip_size_kb": 0, 00:29:37.093 "state": "online", 00:29:37.093 "raid_level": "raid1", 00:29:37.093 "superblock": true, 00:29:37.093 "num_base_bdevs": 2, 00:29:37.093 "num_base_bdevs_discovered": 2, 00:29:37.093 "num_base_bdevs_operational": 2, 00:29:37.093 "process": { 00:29:37.093 "type": "rebuild", 00:29:37.093 "target": "spare", 00:29:37.093 "progress": { 00:29:37.093 "blocks": 3072, 00:29:37.093 "percent": 38 00:29:37.093 } 00:29:37.093 }, 00:29:37.093 "base_bdevs_list": [ 00:29:37.093 { 00:29:37.093 "name": "spare", 00:29:37.093 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:37.093 "is_configured": true, 00:29:37.093 "data_offset": 256, 00:29:37.093 "data_size": 7936 00:29:37.093 }, 00:29:37.093 { 00:29:37.093 "name": "BaseBdev2", 00:29:37.093 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:37.093 "is_configured": true, 00:29:37.093 "data_offset": 256, 00:29:37.093 "data_size": 7936 00:29:37.093 } 00:29:37.093 ] 00:29:37.093 }' 00:29:37.093 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:37.093 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:37.093 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:37.093 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:37.093 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:37.351 [2024-07-25 07:35:09.794362] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:37.351 [2024-07-25 07:35:09.840976] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:37.351 [2024-07-25 07:35:09.841016] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:37.351 [2024-07-25 07:35:09.841030] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:37.351 [2024-07-25 07:35:09.841038] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:37.351 07:35:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.609 07:35:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:37.609 "name": "raid_bdev1", 00:29:37.609 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:37.609 "strip_size_kb": 0, 00:29:37.609 "state": "online", 00:29:37.609 "raid_level": "raid1", 00:29:37.609 "superblock": true, 00:29:37.609 "num_base_bdevs": 2, 00:29:37.609 "num_base_bdevs_discovered": 1, 00:29:37.609 "num_base_bdevs_operational": 1, 00:29:37.609 "base_bdevs_list": [ 00:29:37.609 { 00:29:37.609 "name": null, 00:29:37.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:37.609 "is_configured": false, 00:29:37.609 "data_offset": 256, 00:29:37.609 "data_size": 7936 00:29:37.609 }, 00:29:37.609 { 00:29:37.609 "name": "BaseBdev2", 00:29:37.609 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:37.609 "is_configured": true, 00:29:37.609 "data_offset": 256, 00:29:37.609 "data_size": 7936 00:29:37.609 } 00:29:37.609 ] 00:29:37.609 }' 00:29:37.609 07:35:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:37.609 07:35:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:38.174 07:35:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:38.431 [2024-07-25 
07:35:10.883317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:38.431 [2024-07-25 07:35:10.883360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:38.431 [2024-07-25 07:35:10.883381] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15c77d0 00:29:38.431 [2024-07-25 07:35:10.883393] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:38.431 [2024-07-25 07:35:10.883560] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:38.431 [2024-07-25 07:35:10.883574] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:38.431 [2024-07-25 07:35:10.883622] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:38.431 [2024-07-25 07:35:10.883634] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:38.431 [2024-07-25 07:35:10.883644] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:38.431 [2024-07-25 07:35:10.883660] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:38.431 [2024-07-25 07:35:10.886992] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1431ac0 00:29:38.431 [2024-07-25 07:35:10.888345] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:38.431 spare 00:29:38.431 07:35:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # sleep 1 00:29:39.804 07:35:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:39.804 07:35:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:39.804 07:35:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:39.804 07:35:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:39.804 07:35:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:39.804 07:35:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.804 07:35:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:39.804 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:39.804 "name": "raid_bdev1", 00:29:39.804 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:39.804 "strip_size_kb": 0, 00:29:39.804 "state": "online", 00:29:39.804 "raid_level": "raid1", 00:29:39.804 "superblock": true, 00:29:39.804 "num_base_bdevs": 2, 00:29:39.804 "num_base_bdevs_discovered": 2, 00:29:39.804 "num_base_bdevs_operational": 2, 00:29:39.804 "process": { 00:29:39.804 "type": "rebuild", 00:29:39.804 "target": "spare", 00:29:39.804 "progress": { 00:29:39.804 "blocks": 3072, 00:29:39.804 "percent": 38 00:29:39.804 } 00:29:39.804 }, 00:29:39.804 "base_bdevs_list": [ 00:29:39.804 { 00:29:39.804 "name": "spare", 00:29:39.804 "uuid": "091cc2e8-56f4-576d-b85b-de3977626774", 00:29:39.804 "is_configured": true, 00:29:39.804 "data_offset": 256, 00:29:39.804 
"data_size": 7936 00:29:39.804 }, 00:29:39.804 { 00:29:39.804 "name": "BaseBdev2", 00:29:39.804 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:39.804 "is_configured": true, 00:29:39.804 "data_offset": 256, 00:29:39.804 "data_size": 7936 00:29:39.804 } 00:29:39.804 ] 00:29:39.804 }' 00:29:39.804 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:39.804 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:39.804 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:39.804 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:39.804 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:40.062 [2024-07-25 07:35:12.429324] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:40.062 [2024-07-25 07:35:12.500014] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:40.062 [2024-07-25 07:35:12.500054] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:40.062 [2024-07-25 07:35:12.500068] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:40.062 [2024-07-25 07:35:12.500075] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.062 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:40.320 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:40.320 "name": "raid_bdev1", 00:29:40.320 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:40.320 "strip_size_kb": 0, 00:29:40.320 "state": "online", 00:29:40.320 
"raid_level": "raid1", 00:29:40.320 "superblock": true, 00:29:40.320 "num_base_bdevs": 2, 00:29:40.320 "num_base_bdevs_discovered": 1, 00:29:40.320 "num_base_bdevs_operational": 1, 00:29:40.320 "base_bdevs_list": [ 00:29:40.320 { 00:29:40.320 "name": null, 00:29:40.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:40.320 "is_configured": false, 00:29:40.320 "data_offset": 256, 00:29:40.320 "data_size": 7936 00:29:40.320 }, 00:29:40.320 { 00:29:40.320 "name": "BaseBdev2", 00:29:40.320 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:40.320 "is_configured": true, 00:29:40.320 "data_offset": 256, 00:29:40.320 "data_size": 7936 00:29:40.320 } 00:29:40.320 ] 00:29:40.320 }' 00:29:40.320 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:40.320 07:35:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:40.885 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:40.885 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:40.885 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:40.885 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:40.885 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:40.885 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.885 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:41.143 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:41.143 "name": "raid_bdev1", 00:29:41.143 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:41.143 "strip_size_kb": 0, 00:29:41.143 "state": "online", 00:29:41.143 "raid_level": "raid1", 00:29:41.143 "superblock": true, 00:29:41.143 "num_base_bdevs": 2, 00:29:41.143 "num_base_bdevs_discovered": 1, 00:29:41.143 "num_base_bdevs_operational": 1, 00:29:41.143 "base_bdevs_list": [ 00:29:41.143 { 00:29:41.143 "name": null, 00:29:41.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:41.143 "is_configured": false, 00:29:41.143 "data_offset": 256, 00:29:41.143 "data_size": 7936 00:29:41.143 }, 00:29:41.143 { 00:29:41.143 "name": "BaseBdev2", 00:29:41.143 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:41.143 "is_configured": true, 00:29:41.143 "data_offset": 256, 00:29:41.143 "data_size": 7936 00:29:41.143 } 00:29:41.143 ] 00:29:41.143 }' 00:29:41.143 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:41.143 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:41.143 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:41.143 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:41.143 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:41.401 07:35:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:41.658 [2024-07-25 07:35:14.080027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:41.658 [2024-07-25 07:35:14.080067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:41.658 [2024-07-25 07:35:14.080090] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1431560 00:29:41.658 [2024-07-25 07:35:14.080102] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:41.658 [2024-07-25 07:35:14.080254] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:41.659 [2024-07-25 07:35:14.080270] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:41.659 [2024-07-25 07:35:14.080312] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:41.659 [2024-07-25 07:35:14.080323] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:41.659 [2024-07-25 07:35:14.080333] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:41.659 BaseBdev1 00:29:41.659 07:35:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@789 -- # sleep 1 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.592 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.850 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.850 "name": "raid_bdev1", 00:29:42.850 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:42.850 "strip_size_kb": 0, 00:29:42.850 "state": "online", 00:29:42.850 "raid_level": "raid1", 00:29:42.850 
"superblock": true, 00:29:42.850 "num_base_bdevs": 2, 00:29:42.850 "num_base_bdevs_discovered": 1, 00:29:42.850 "num_base_bdevs_operational": 1, 00:29:42.850 "base_bdevs_list": [ 00:29:42.850 { 00:29:42.850 "name": null, 00:29:42.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.850 "is_configured": false, 00:29:42.850 "data_offset": 256, 00:29:42.850 "data_size": 7936 00:29:42.850 }, 00:29:42.850 { 00:29:42.850 "name": "BaseBdev2", 00:29:42.850 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:42.850 "is_configured": true, 00:29:42.850 "data_offset": 256, 00:29:42.850 "data_size": 7936 00:29:42.850 } 00:29:42.850 ] 00:29:42.850 }' 00:29:42.850 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.850 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:43.417 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:43.417 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:43.417 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:43.417 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:43.417 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:43.417 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.417 07:35:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:43.675 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:43.675 "name": "raid_bdev1", 00:29:43.675 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:43.675 "strip_size_kb": 0, 00:29:43.675 "state": "online", 00:29:43.675 "raid_level": "raid1", 00:29:43.675 "superblock": true, 00:29:43.675 "num_base_bdevs": 2, 00:29:43.675 "num_base_bdevs_discovered": 1, 00:29:43.675 "num_base_bdevs_operational": 1, 00:29:43.675 "base_bdevs_list": [ 00:29:43.675 { 00:29:43.675 "name": null, 00:29:43.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:43.676 "is_configured": false, 00:29:43.676 "data_offset": 256, 00:29:43.676 "data_size": 7936 00:29:43.676 }, 00:29:43.676 { 00:29:43.676 "name": "BaseBdev2", 00:29:43.676 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:43.676 "is_configured": true, 00:29:43.676 "data_offset": 256, 00:29:43.676 "data_size": 7936 00:29:43.676 } 00:29:43.676 ] 00:29:43.676 }' 00:29:43.676 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:43.676 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:43.676 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:43.934 [2024-07-25 07:35:16.446441] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:43.934 [2024-07-25 07:35:16.446550] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:43.934 [2024-07-25 07:35:16.446564] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:43.934 request: 00:29:43.934 { 00:29:43.934 "base_bdev": "BaseBdev1", 00:29:43.934 "raid_bdev": "raid_bdev1", 00:29:43.934 "method": "bdev_raid_add_base_bdev", 00:29:43.934 "req_id": 1 00:29:43.934 } 00:29:43.934 Got JSON-RPC error response 00:29:43.934 response: 00:29:43.934 { 00:29:43.934 "code": -22, 00:29:43.934 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:43.934 } 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:29:43.934 07:35:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@793 -- # sleep 1 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:45.310 07:35:17 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.310 "name": "raid_bdev1", 00:29:45.310 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:45.310 "strip_size_kb": 0, 00:29:45.310 "state": "online", 00:29:45.310 "raid_level": "raid1", 00:29:45.310 "superblock": true, 00:29:45.310 "num_base_bdevs": 2, 00:29:45.310 "num_base_bdevs_discovered": 1, 00:29:45.310 "num_base_bdevs_operational": 1, 00:29:45.310 "base_bdevs_list": [ 00:29:45.310 { 00:29:45.310 "name": null, 00:29:45.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.310 "is_configured": false, 00:29:45.310 "data_offset": 256, 00:29:45.310 "data_size": 7936 00:29:45.310 }, 00:29:45.310 { 00:29:45.310 "name": "BaseBdev2", 00:29:45.310 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:45.310 "is_configured": true, 00:29:45.310 "data_offset": 256, 00:29:45.310 "data_size": 7936 00:29:45.310 } 00:29:45.310 ] 00:29:45.310 }' 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.310 07:35:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:45.877 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:45.877 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:45.877 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:45.877 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:45.877 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:45.877 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.877 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:46.136 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:46.136 "name": "raid_bdev1", 00:29:46.136 "uuid": "376d6121-e1f4-4306-a7e1-07c21d447d70", 00:29:46.136 "strip_size_kb": 0, 00:29:46.136 "state": "online", 00:29:46.136 "raid_level": "raid1", 00:29:46.136 "superblock": true, 00:29:46.136 "num_base_bdevs": 2, 00:29:46.136 "num_base_bdevs_discovered": 1, 00:29:46.136 "num_base_bdevs_operational": 1, 00:29:46.136 "base_bdevs_list": [ 00:29:46.136 { 00:29:46.136 "name": null, 00:29:46.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:46.136 "is_configured": false, 00:29:46.136 "data_offset": 256, 00:29:46.136 "data_size": 7936 00:29:46.136 }, 00:29:46.136 { 00:29:46.136 "name": "BaseBdev2", 00:29:46.136 "uuid": "51a82558-f416-5844-b620-dadab5239800", 00:29:46.136 "is_configured": true, 00:29:46.136 "data_offset": 256, 00:29:46.136 "data_size": 7936 00:29:46.136 } 00:29:46.136 ] 00:29:46.136 }' 00:29:46.136 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:46.136 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:46.136 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:46.136 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:46.136 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@798 -- # killprocess 1777987 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1777987 ']' 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1777987 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1777987 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1777987' 00:29:46.137 killing process with pid 1777987 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1777987 00:29:46.137 Received shutdown signal, test time was about 60.000000 seconds 00:29:46.137 00:29:46.137 Latency(us) 00:29:46.137 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:46.137 =================================================================================================================== 00:29:46.137 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:46.137 [2024-07-25 07:35:18.657977] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:46.137 [2024-07-25 07:35:18.658055] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:46.137 [2024-07-25 07:35:18.658094] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:46.137 [2024-07-25 07:35:18.658106] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1430640 name raid_bdev1, state offline 00:29:46.137 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1777987 00:29:46.395 [2024-07-25 07:35:18.683059] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:46.395 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@800 -- # return 0 00:29:46.395 00:29:46.395 real 0m27.728s 00:29:46.395 user 0m43.824s 00:29:46.395 sys 0m3.666s 00:29:46.395 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:46.395 07:35:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:46.395 ************************************ 00:29:46.395 END TEST raid_rebuild_test_sb_md_interleaved 00:29:46.395 ************************************ 00:29:46.395 07:35:18 bdev_raid -- bdev/bdev_raid.sh@996 -- # trap - EXIT 00:29:46.395 07:35:18 bdev_raid -- bdev/bdev_raid.sh@997 -- # cleanup 00:29:46.395 07:35:18 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1777987 ']' 00:29:46.395 07:35:18 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1777987 00:29:46.654 07:35:18 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:46.654 00:29:46.654 real 17m46.374s 00:29:46.654 user 30m0.626s 00:29:46.654 sys 3m14.415s 00:29:46.654 07:35:18 bdev_raid -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:46.654 07:35:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:46.654 ************************************ 00:29:46.654 END TEST bdev_raid 00:29:46.654 ************************************ 00:29:46.654 07:35:19 -- spdk/autotest.sh@195 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:46.654 07:35:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:29:46.654 07:35:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:46.654 07:35:19 -- common/autotest_common.sh@10 -- # set +x 00:29:46.654 ************************************ 00:29:46.654 START TEST bdevperf_config 00:29:46.654 ************************************ 00:29:46.654 07:35:19 bdevperf_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:46.654 * Looking for test storage... 
00:29:46.654 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:46.654 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:46.654 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:46.654 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:29:46.654 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:46.654 00:29:46.654 07:35:19 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:46.913 07:35:19 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:49.448 07:35:21 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-25 07:35:19.247390] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:49.449 [2024-07-25 07:35:19.247452] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1783108 ] 00:29:49.449 Using job config with 4 jobs 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested 
device 0000:3d:02.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:49.449 [2024-07-25 07:35:19.395321] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:49.449 [2024-07-25 07:35:19.503290] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:49.449 cpumask for '\''job0'\'' is too big 00:29:49.449 cpumask for '\''job1'\'' is too big 00:29:49.449 cpumask for '\''job2'\'' is too big 00:29:49.449 cpumask for '\''job3'\'' is too big 00:29:49.449 Running I/O for 2 seconds... 
00:29:49.449 00:29:49.449 Latency(us) 00:29:49.449 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:49.449 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.449 Malloc0 : 2.01 25968.64 25.36 0.00 0.00 9853.40 1730.15 15099.49 00:29:49.449 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.449 Malloc0 : 2.02 25978.11 25.37 0.00 0.00 9829.69 1743.26 13316.92 00:29:49.449 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.449 Malloc0 : 2.02 25955.66 25.35 0.00 0.00 9817.29 1703.94 11586.76 00:29:49.449 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.449 Malloc0 : 2.02 25933.66 25.33 0.00 0.00 9804.90 1703.94 10171.19 00:29:49.449 =================================================================================================================== 00:29:49.449 Total : 103836.07 101.40 0.00 0.00 9826.29 1703.94 15099.49' 00:29:49.449 07:35:21 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-25 07:35:19.247390] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:49.449 [2024-07-25 07:35:19.247452] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1783108 ] 00:29:49.449 Using job config with 4 jobs 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of 
QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:49.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.449 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:49.449 [2024-07-25 07:35:19.395321] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:49.450 [2024-07-25 07:35:19.503290] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:49.450 cpumask for '\''job0'\'' is too big 00:29:49.450 cpumask for '\''job1'\'' is too big 00:29:49.450 cpumask for '\''job2'\'' is too big 00:29:49.450 cpumask for '\''job3'\'' is too big 00:29:49.450 Running I/O for 2 seconds... 
00:29:49.450 00:29:49.450 Latency(us) 00:29:49.450 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.01 25968.64 25.36 0.00 0.00 9853.40 1730.15 15099.49 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.02 25978.11 25.37 0.00 0.00 9829.69 1743.26 13316.92 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.02 25955.66 25.35 0.00 0.00 9817.29 1703.94 11586.76 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.02 25933.66 25.33 0.00 0.00 9804.90 1703.94 10171.19 00:29:49.450 =================================================================================================================== 00:29:49.450 Total : 103836.07 101.40 0.00 0.00 9826.29 1703.94 15099.49' 00:29:49.450 07:35:21 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 07:35:19.247390] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:49.450 [2024-07-25 07:35:19.247452] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1783108 ] 00:29:49.450 Using job config with 4 jobs 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:29:49.450 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:49.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.450 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:49.450 [2024-07-25 07:35:19.395321] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:49.450 [2024-07-25 07:35:19.503290] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:49.450 cpumask for '\''job0'\'' is too big 00:29:49.450 cpumask for '\''job1'\'' is too big 00:29:49.450 cpumask for '\''job2'\'' is too big 00:29:49.450 cpumask for '\''job3'\'' is too big 00:29:49.450 Running I/O for 2 seconds... 
00:29:49.450 00:29:49.450 Latency(us) 00:29:49.450 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.01 25968.64 25.36 0.00 0.00 9853.40 1730.15 15099.49 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.02 25978.11 25.37 0.00 0.00 9829.69 1743.26 13316.92 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.02 25955.66 25.35 0.00 0.00 9817.29 1703.94 11586.76 00:29:49.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.450 Malloc0 : 2.02 25933.66 25.33 0.00 0.00 9804.90 1703.94 10171.19 00:29:49.450 =================================================================================================================== 00:29:49.450 Total : 103836.07 101.40 0.00 0.00 9826.29 1703.94 15099.49' 00:29:49.450 07:35:21 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:49.450 07:35:21 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:49.450 07:35:21 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:29:49.450 07:35:21 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:49.450 [2024-07-25 07:35:21.947245] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:49.450 [2024-07-25 07:35:21.947306] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1783477 ] 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.2 cannot be used 
00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:49.718 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:49.718 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:49.718 [2024-07-25 07:35:22.093731] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:49.718 [2024-07-25 07:35:22.186404] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:49.977 cpumask for 'job0' is too big 00:29:49.977 cpumask for 'job1' is too big 00:29:49.977 cpumask for 'job2' is too big 00:29:49.977 cpumask for 'job3' is too big 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:29:52.508 Running I/O for 2 seconds... 
00:29:52.508 00:29:52.508 Latency(us) 00:29:52.508 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:52.508 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.508 Malloc0 : 2.01 25942.29 25.33 0.00 0.00 9858.26 1703.94 15099.49 00:29:52.508 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.508 Malloc0 : 2.01 25920.18 25.31 0.00 0.00 9845.38 1690.83 13316.92 00:29:52.508 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.508 Malloc0 : 2.02 25960.80 25.35 0.00 0.00 9809.67 1690.83 11639.19 00:29:52.508 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.508 Malloc0 : 2.02 25938.79 25.33 0.00 0.00 9797.64 1690.83 10223.62 00:29:52.508 =================================================================================================================== 00:29:52.508 Total : 103762.05 101.33 0.00 0.00 9827.68 1690.83 15099.49' 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.508 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.508 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.508 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.508 07:35:24 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:55.041 07:35:27 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-25 07:35:24.661062] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:55.041 [2024-07-25 07:35:24.661127] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1784005 ] 00:29:55.041 Using job config with 3 jobs 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 
0000:3f:01.4 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:55.041 [2024-07-25 07:35:24.809993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.041 [2024-07-25 07:35:24.911494] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.041 cpumask for '\''job0'\'' is too big 00:29:55.041 cpumask for '\''job1'\'' is too big 00:29:55.041 cpumask for '\''job2'\'' is too big 00:29:55.041 Running I/O for 2 seconds... 00:29:55.041 00:29:55.041 Latency(us) 00:29:55.041 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.041 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.041 Malloc0 : 2.01 34987.83 34.17 0.00 0.00 7310.04 1677.72 10695.48 00:29:55.041 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.041 Malloc0 : 2.01 34958.02 34.14 0.00 0.00 7301.28 1664.61 9017.75 00:29:55.041 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.041 Malloc0 : 2.02 34928.26 34.11 0.00 0.00 7292.45 1644.95 7602.18 00:29:55.041 =================================================================================================================== 00:29:55.041 Total : 104874.11 102.42 0.00 0.00 7301.26 1644.95 10695.48' 00:29:55.041 07:35:27 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-25 07:35:24.661062] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:29:55.041 [2024-07-25 07:35:24.661127] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1784005 ] 00:29:55.041 Using job config with 3 jobs 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:55.041 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.041 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:55.042 [2024-07-25 07:35:24.809993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.042 [2024-07-25 07:35:24.911494] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.042 cpumask for '\''job0'\'' is too big 00:29:55.042 cpumask for '\''job1'\'' is too big 00:29:55.042 cpumask for '\''job2'\'' is too big 00:29:55.042 Running I/O for 2 seconds... 00:29:55.042 00:29:55.042 Latency(us) 00:29:55.042 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.042 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.042 Malloc0 : 2.01 34987.83 34.17 0.00 0.00 7310.04 1677.72 10695.48 00:29:55.042 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.042 Malloc0 : 2.01 34958.02 34.14 0.00 0.00 7301.28 1664.61 9017.75 00:29:55.042 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.042 Malloc0 : 2.02 34928.26 34.11 0.00 0.00 7292.45 1644.95 7602.18 00:29:55.042 =================================================================================================================== 00:29:55.042 Total : 104874.11 102.42 0.00 0.00 7301.26 1644.95 10695.48' 00:29:55.042 07:35:27 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 07:35:24.661062] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:29:55.042 [2024-07-25 07:35:24.661127] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1784005 ] 00:29:55.042 Using job config with 3 jobs 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:55.042 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:55.042 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:55.042 [2024-07-25 07:35:24.809993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.042 [2024-07-25 07:35:24.911494] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.042 cpumask for '\''job0'\'' is too big 00:29:55.042 cpumask for '\''job1'\'' is too big 00:29:55.042 cpumask for '\''job2'\'' is too big 00:29:55.042 Running I/O for 2 seconds... 00:29:55.042 00:29:55.042 Latency(us) 00:29:55.042 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.042 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.042 Malloc0 : 2.01 34987.83 34.17 0.00 0.00 7310.04 1677.72 10695.48 00:29:55.042 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.042 Malloc0 : 2.01 34958.02 34.14 0.00 0.00 7301.28 1664.61 9017.75 00:29:55.042 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.042 Malloc0 : 2.02 34928.26 34.11 0.00 0.00 7292.45 1644.95 7602.18 00:29:55.042 =================================================================================================================== 00:29:55.042 Total : 104874.11 102.42 0.00 0.00 7301.26 1644.95 10695.48' 00:29:55.042 07:35:27 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:55.042 07:35:27 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:55.043 07:35:27 bdevperf_config 
-- bdevperf/common.sh@13 -- # cat 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.043 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.043 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.043 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.043 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.043 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.043 07:35:27 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:57.578 07:35:30 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-25 07:35:27.392290] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:29:57.578 [2024-07-25 07:35:27.392353] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1784447 ] 00:29:57.578 Using job config with 4 jobs 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:57.578 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.578 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:57.578 [2024-07-25 07:35:27.545707] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.578 [2024-07-25 07:35:27.647981] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.578 cpumask for '\''job0'\'' is too big 00:29:57.578 cpumask for '\''job1'\'' is too big 00:29:57.578 cpumask for '\''job2'\'' is too big 00:29:57.578 cpumask for '\''job3'\'' is too big 00:29:57.578 Running I/O for 2 seconds... 00:29:57.578 00:29:57.578 Latency(us) 00:29:57.578 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:57.578 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.578 Malloc0 : 2.04 12936.33 12.63 0.00 0.00 19778.11 3486.52 30408.70 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.04 12925.17 12.62 0.00 0.00 19777.01 4246.73 30408.70 00:29:57.579 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc0 : 2.04 12914.36 12.61 0.00 0.00 19727.08 3460.30 26843.55 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.04 12903.21 12.60 0.00 0.00 19727.49 4246.73 26843.55 00:29:57.579 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc0 : 2.05 12892.45 12.59 0.00 0.00 19678.80 3460.30 23383.24 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.05 12881.42 12.58 0.00 0.00 19677.84 4220.52 23383.24 00:29:57.579 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc0 : 2.05 12870.59 12.57 0.00 0.00 19631.97 3460.30 20237.52 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.05 12859.57 12.56 0.00 0.00 19632.48 4246.73 20237.52 00:29:57.579 =================================================================================================================== 00:29:57.579 Total : 103183.10 100.76 0.00 0.00 19703.85 3460.30 30408.70' 00:29:57.579 07:35:30 bdevperf_config -- 
bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-25 07:35:27.392290] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:57.579 [2024-07-25 07:35:27.392353] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1784447 ] 00:29:57.579 Using job config with 4 jobs 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: 
Requested device 0000:3f:01.5 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:57.579 [2024-07-25 07:35:27.545707] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.579 [2024-07-25 07:35:27.647981] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.579 cpumask for '\''job0'\'' is too big 00:29:57.579 cpumask for '\''job1'\'' is too big 00:29:57.579 cpumask for '\''job2'\'' is too big 00:29:57.579 cpumask for '\''job3'\'' is too big 00:29:57.579 Running I/O for 2 seconds... 
00:29:57.579 00:29:57.579 Latency(us) 00:29:57.579 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:57.579 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc0 : 2.04 12936.33 12.63 0.00 0.00 19778.11 3486.52 30408.70 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.04 12925.17 12.62 0.00 0.00 19777.01 4246.73 30408.70 00:29:57.579 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc0 : 2.04 12914.36 12.61 0.00 0.00 19727.08 3460.30 26843.55 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.04 12903.21 12.60 0.00 0.00 19727.49 4246.73 26843.55 00:29:57.579 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc0 : 2.05 12892.45 12.59 0.00 0.00 19678.80 3460.30 23383.24 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.05 12881.42 12.58 0.00 0.00 19677.84 4220.52 23383.24 00:29:57.579 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc0 : 2.05 12870.59 12.57 0.00 0.00 19631.97 3460.30 20237.52 00:29:57.579 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.579 Malloc1 : 2.05 12859.57 12.56 0.00 0.00 19632.48 4246.73 20237.52 00:29:57.579 =================================================================================================================== 00:29:57.579 Total : 103183.10 100.76 0.00 0.00 19703.85 3460.30 30408.70' 00:29:57.579 07:35:30 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:57.579 07:35:30 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 07:35:27.392290] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:29:57.579 [2024-07-25 07:35:27.392353] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1784447 ] 00:29:57.579 Using job config with 4 jobs 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.579 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:57.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:57.580 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:57.580 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:57.580 [2024-07-25 07:35:27.545707] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.580 [2024-07-25 07:35:27.647981] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.580 cpumask for '\''job0'\'' is too big 00:29:57.580 cpumask for '\''job1'\'' is too big 00:29:57.580 cpumask for '\''job2'\'' is too big 00:29:57.580 cpumask for '\''job3'\'' is too big 00:29:57.580 Running I/O for 2 seconds... 00:29:57.580 00:29:57.580 Latency(us) 00:29:57.580 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:57.580 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc0 : 2.04 12936.33 12.63 0.00 0.00 19778.11 3486.52 30408.70 00:29:57.580 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc1 : 2.04 12925.17 12.62 0.00 0.00 19777.01 4246.73 30408.70 00:29:57.580 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc0 : 2.04 12914.36 12.61 0.00 0.00 19727.08 3460.30 26843.55 00:29:57.580 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc1 : 2.04 12903.21 12.60 0.00 0.00 19727.49 4246.73 26843.55 00:29:57.580 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc0 : 2.05 12892.45 12.59 0.00 0.00 19678.80 3460.30 23383.24 00:29:57.580 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc1 : 2.05 12881.42 12.58 0.00 0.00 19677.84 4220.52 23383.24 00:29:57.580 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc0 : 2.05 12870.59 12.57 0.00 0.00 19631.97 3460.30 20237.52 00:29:57.580 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:57.580 Malloc1 : 2.05 12859.57 12.56 0.00 0.00 19632.48 4246.73 20237.52 00:29:57.580 =================================================================================================================== 00:29:57.580 Total : 103183.10 100.76 0.00 0.00 19703.85 3460.30 30408.70' 00:29:57.580 07:35:30 bdevperf_config -- 
bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:57.580 07:35:30 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:29:57.580 07:35:30 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:29:57.580 07:35:30 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:57.580 07:35:30 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:29:57.580 00:29:57.580 real 0m11.032s 00:29:57.580 user 0m9.741s 00:29:57.580 sys 0m1.137s 00:29:57.580 07:35:30 bdevperf_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:57.580 07:35:30 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:29:57.580 ************************************ 00:29:57.580 END TEST bdevperf_config 00:29:57.580 ************************************ 00:29:57.840 07:35:30 -- spdk/autotest.sh@196 -- # uname -s 00:29:57.840 07:35:30 -- spdk/autotest.sh@196 -- # [[ Linux == Linux ]] 00:29:57.840 07:35:30 -- spdk/autotest.sh@197 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:57.840 07:35:30 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:29:57.840 07:35:30 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:57.840 07:35:30 -- common/autotest_common.sh@10 -- # set +x 00:29:57.840 ************************************ 00:29:57.840 START TEST reactor_set_interrupt 00:29:57.840 ************************************ 00:29:57.840 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:57.840 * Looking for test storage... 00:29:57.840 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:57.840 07:35:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:57.841 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:57.841 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:57.841 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:57.841 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
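The bdevperf_config test that ends above follows a simple pattern: create_job appends INI-style sections to test.conf, bdevperf is started with both the device definition (--json conf.json) and the job file (-j test.conf), and get_num_jobs pulls the job count out of the line "Using job config with N jobs". A minimal Bash sketch of that flow is below, assuming a job file like the one the traced create_job calls would produce; the section contents and paths are illustrative stand-ins, not the exact file generated by bdevperf/common.sh (which the cleanup step removes).

  # Illustrative job file mirroring the traced create_job calls: [global] carries
  # rw/filename, and job0..job3 are empty sections that inherit the global settings.
  printf '%s\n' '[global]' 'filename=Malloc0:Malloc1' 'rw=rw' \
                '[job0]' '[job1]' '[job2]' '[job3]' > test.conf

  # Run bdevperf for 2 seconds against the generated job file (paths shortened here).
  out=$(./build/examples/bdevperf -t 2 --json conf.json -j test.conf 2>&1)

  # Same extraction pipeline as the traced get_num_jobs: isolate the job count.
  num_jobs=$(echo "$out" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+')
  [[ $num_jobs == 4 ]] && echo "bdevperf accepted the job config with $num_jobs jobs"

The final comparison is the same check the trace above performs as [[ 4 == \4 ]] after stripping the count from the captured output.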
00:29:57.841 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:57.841 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@23 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@59 -- # 
CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:57.841 07:35:30 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:57.841 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:57.841 07:35:30 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:57.842 07:35:30 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:57.842 07:35:30 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:57.842 07:35:30 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:57.842 #define SPDK_CONFIG_H 00:29:57.842 #define SPDK_CONFIG_APPS 1 00:29:57.842 #define SPDK_CONFIG_ARCH native 00:29:57.842 #undef SPDK_CONFIG_ASAN 00:29:57.842 #undef SPDK_CONFIG_AVAHI 00:29:57.842 #undef SPDK_CONFIG_CET 00:29:57.842 #define SPDK_CONFIG_COVERAGE 1 00:29:57.842 #define SPDK_CONFIG_CROSS_PREFIX 00:29:57.842 #define SPDK_CONFIG_CRYPTO 1 00:29:57.842 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:57.842 #undef SPDK_CONFIG_CUSTOMOCF 00:29:57.842 #undef SPDK_CONFIG_DAOS 00:29:57.842 #define SPDK_CONFIG_DAOS_DIR 00:29:57.842 #define SPDK_CONFIG_DEBUG 1 00:29:57.842 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:57.842 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:57.842 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:57.842 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:57.842 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:57.842 #undef SPDK_CONFIG_DPDK_UADK 00:29:57.842 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:57.842 #define SPDK_CONFIG_EXAMPLES 1 00:29:57.842 #undef SPDK_CONFIG_FC 00:29:57.842 #define SPDK_CONFIG_FC_PATH 00:29:57.842 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:57.842 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:57.842 #undef SPDK_CONFIG_FUSE 00:29:57.842 #undef SPDK_CONFIG_FUZZER 00:29:57.842 #define SPDK_CONFIG_FUZZER_LIB 00:29:57.842 #undef SPDK_CONFIG_GOLANG 00:29:57.842 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:57.842 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:57.842 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:57.842 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:57.842 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:57.842 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:57.842 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:57.842 #define SPDK_CONFIG_IDXD 1 00:29:57.842 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:57.842 #define SPDK_CONFIG_IPSEC_MB 1 00:29:57.842 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:57.842 #define SPDK_CONFIG_ISAL 1 00:29:57.842 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:57.842 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:57.842 #define SPDK_CONFIG_LIBDIR 00:29:57.842 #undef SPDK_CONFIG_LTO 00:29:57.842 #define SPDK_CONFIG_MAX_LCORES 128 00:29:57.842 #define SPDK_CONFIG_NVME_CUSE 1 00:29:57.842 #undef SPDK_CONFIG_OCF 00:29:57.842 #define SPDK_CONFIG_OCF_PATH 00:29:57.842 #define SPDK_CONFIG_OPENSSL_PATH 00:29:57.842 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:57.842 #define SPDK_CONFIG_PGO_DIR 00:29:57.842 #undef 
SPDK_CONFIG_PGO_USE 00:29:57.842 #define SPDK_CONFIG_PREFIX /usr/local 00:29:57.842 #undef SPDK_CONFIG_RAID5F 00:29:57.842 #undef SPDK_CONFIG_RBD 00:29:57.842 #define SPDK_CONFIG_RDMA 1 00:29:57.842 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:57.842 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:57.842 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:57.842 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:57.842 #define SPDK_CONFIG_SHARED 1 00:29:57.842 #undef SPDK_CONFIG_SMA 00:29:57.842 #define SPDK_CONFIG_TESTS 1 00:29:57.842 #undef SPDK_CONFIG_TSAN 00:29:57.842 #define SPDK_CONFIG_UBLK 1 00:29:57.842 #define SPDK_CONFIG_UBSAN 1 00:29:57.842 #undef SPDK_CONFIG_UNIT_TESTS 00:29:57.842 #undef SPDK_CONFIG_URING 00:29:57.842 #define SPDK_CONFIG_URING_PATH 00:29:57.842 #undef SPDK_CONFIG_URING_ZNS 00:29:57.842 #undef SPDK_CONFIG_USDT 00:29:57.842 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:57.842 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:57.842 #undef SPDK_CONFIG_VFIO_USER 00:29:57.842 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:57.842 #define SPDK_CONFIG_VHOST 1 00:29:57.842 #define SPDK_CONFIG_VIRTIO 1 00:29:57.842 #undef SPDK_CONFIG_VTUNE 00:29:57.842 #define SPDK_CONFIG_VTUNE_DIR 00:29:57.842 #define SPDK_CONFIG_WERROR 1 00:29:57.842 #define SPDK_CONFIG_WPDK_DIR 00:29:57.842 #undef SPDK_CONFIG_XNVME 00:29:57.842 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:57.842 07:35:30 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:57.842 07:35:30 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.842 07:35:30 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.842 07:35:30 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.842 07:35:30 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:29:57.842 07:35:30 reactor_set_interrupt -- paths/export.sh@6 -- 
# echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:57.842 07:35:30 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:57.842 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:57.843 07:35:30 
reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 
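
The alternating ": <value>" / "export SPDK_TEST_*" entries throughout this block are what a default-then-export idiom looks like under xtrace: a no-op colon command expands a "${VAR:=default}" assignment, so only the final value shows up in the trace, followed by the export of the variable it just set. A minimal sketch consistent with that pattern (the variable name and default here are illustrative placeholders, not taken from autotest_common.sh):

    #!/usr/bin/env bash
    set -x
    # Give SPDK_TEST_EXAMPLE a default of 0 unless the caller already set it,
    # then export it for every child process the test spawns. Under xtrace this
    # prints ": 0" followed by "export SPDK_TEST_EXAMPLE", matching the trace.
    : "${SPDK_TEST_EXAMPLE:=0}"
    export SPDK_TEST_EXAMPLE
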
00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:29:57.843 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 1 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@166 -- # : 0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export 
SPDK_TEST_FUZZER_TARGET 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@173 -- # : 0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@179 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@180 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@187 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@187 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@202 -- # cat 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:29:58.104 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 
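
The exports just above pin down the sanitizer runtime for every binary the test launches: ASAN_OPTIONS, UBSAN_OPTIONS, an LSAN suppression file seeded with leak:libfuse3.so, and the default RPC socket path. A standalone sketch that recreates the same environment in a fresh shell, using the option strings shown verbatim in the trace (the trace populates the suppression file via cat/echo; this sketch writes the same single entry directly):

    #!/usr/bin/env bash
    # Recreate the sanitizer environment used by the autotest run traced above.
    supp=/var/tmp/asan_suppression_file
    rm -rf "$supp"
    echo 'leak:libfuse3.so' > "$supp"   # the one suppression the trace emits
    export ASAN_OPTIONS='new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0'
    export UBSAN_OPTIONS='halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134'
    export LSAN_OPTIONS="suppressions=$supp"
    export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
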
00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@265 -- # export valgrind= 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@265 -- # valgrind= 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@271 -- # uname -s 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@281 -- # MAKE=make 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j112 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@301 -- # TEST_MODE= 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@320 -- # [[ -z 1784866 ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@320 -- # kill -0 1784866 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local mount target_dir 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@340 -- # 
storage_fallback=/tmp/spdk.KVTZXu 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@357 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.KVTZXu/tests/interrupt /tmp/spdk.KVTZXu 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@329 -- # df -T 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=954302464 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4330127360 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=55082147840 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=61742305280 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=6660157440 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=30866341888 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871150592 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4808704 00:29:58.105 
07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=12338696192 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=12348461056 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=9764864 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=30870024192 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871154688 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=1130496 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=6174224384 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=6174228480 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:29:58.105 * Looking for test storage... 
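
The df -T / read -r loop above is set_test_storage filling associative arrays with each mount's filesystem type, size, and free space so it can pick a directory with at least the 2 GiB the test requested. A simplified, self-contained sketch of the same technique (array and field names mirror the trace; the --block-size=1 flag is an addition so the numbers come out in bytes, since the helper's exact df invocation is not visible here):

    #!/usr/bin/env bash
    # Find the mount backing a target directory and check its free space,
    # in the style of set_test_storage in the trace above.
    requested_size=$((2 * 1024 * 1024 * 1024))   # 2 GiB, the amount requested above
    declare -A mounts fss sizes avails uses

    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$size
        avails["$mount"]=$avail
        uses["$mount"]=$use
    done < <(df -T --block-size=1 | grep -v Filesystem)

    target_dir=${1:-.}
    mount_point=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
    if (( ${avails[$mount_point]:-0} >= requested_size )); then
        printf '* Found test storage at %s (%s bytes free)\n' "$target_dir" "${avails[$mount_point]}"
    else
        printf '* Not enough space on %s for the requested size\n' "$mount_point" >&2
    fi
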
00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@370 -- # local target_space new_size 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@374 -- # mount=/ 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@376 -- # target_space=55082147840 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:29:58.105 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@383 -- # new_size=8874749952 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.106 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@391 -- # return 0 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1785010 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:58.106 07:35:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1785010 /var/tmp/spdk.sock 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1785010 ']' 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:58.106 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
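
interrupt_common.sh has just launched the interrupt_tgt example with a three-core mask on the /var/tmp/spdk.sock RPC socket and is now blocking in waitforlisten until that socket is usable. A minimal stand-in for that wait, assuming the same binary path and socket shown in the trace (the real waitforlisten helper does more, e.g. bounded retries and an RPC probe; the retry count and sleep interval below are illustrative):

    #!/usr/bin/env bash
    # Start the interrupt target the way the trace does, then wait for its
    # UNIX-domain RPC socket before issuing any rpc.py calls against it.
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
    rpc_sock=/var/tmp/spdk.sock

    "$SPDK_DIR/build/examples/interrupt_tgt" -m 0x07 -r "$rpc_sock" -E -g &
    tgt_pid=$!

    for (( i = 0; i < 100; i++ )); do
        kill -0 "$tgt_pid" 2>/dev/null || { echo 'interrupt_tgt exited early' >&2; exit 1; }
        [[ -S $rpc_sock ]] && break
        sleep 0.1
    done
    echo "interrupt_tgt (pid $tgt_pid) is listening on $rpc_sock"
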
00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:58.106 07:35:30 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:58.106 [2024-07-25 07:35:30.493914] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:29:58.106 [2024-07-25 07:35:30.493977] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1785010 ] 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 
0000:3f:01.4 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:58.106 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:58.106 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:58.106 [2024-07-25 07:35:30.626194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:58.366 [2024-07-25 07:35:30.714596] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:29:58.366 [2024-07-25 07:35:30.714690] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:29:58.366 [2024-07-25 07:35:30.714694] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:29:58.366 [2024-07-25 07:35:30.784317] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
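
With all three reactors started and the app thread already in interrupt mode, the remainder of the test flips reactors between poll and interrupt mode over the RPC socket and then inspects their threads. A hedged sketch of that step: the reactor_set_interrupt_mode RPC name and argument are assumptions inferred from the test's purpose and are not shown in the trace so far, so verify them against scripts/rpc.py in this SPDK tree before relying on the call; thread_get_stats and the id/cpumask fields are used verbatim later in the trace.

    #!/usr/bin/env bash
    # Assumed usage: ask reactor 1 to switch into interrupt mode, then list the
    # SPDK threads' ids and cpumasks. reactor_set_interrupt_mode is an
    # assumption -- confirm with "rpc.py --help" on this build.
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
    rpc_py="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk.sock"

    $rpc_py reactor_set_interrupt_mode 1          # assumption: not confirmed by this trace
    $rpc_py thread_get_stats | jq '.threads[] | {id, cpumask}'
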
00:29:58.933 07:35:31 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:58.934 07:35:31 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:29:58.934 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:29:58.934 07:35:31 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:59.192 Malloc0 00:29:59.192 Malloc1 00:29:59.192 Malloc2 00:29:59.192 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:29:59.192 07:35:31 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:59.192 07:35:31 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:59.192 07:35:31 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:59.192 5000+0 records in 00:29:59.192 5000+0 records out 00:29:59.192 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0268057 s, 382 MB/s 00:29:59.192 07:35:31 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:59.451 AIO0 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1785010 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1785010 without_thd 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1785010 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:59.451 07:35:31 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:59.709 07:35:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:59.709 07:35:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/common.sh@59 -- # 
jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:59.710 07:35:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:59.969 spdk_thread ids are 1 on reactor0. 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1785010 0 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785010 0 idle 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785010 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785010 -w 256 00:29:59.969 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785010 root 20 0 128.2g 36736 24192 S 6.7 0.1 0:00.38 reactor_0' 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785010 root 20 0 128.2g 36736 24192 S 6.7 0.1 0:00.38 reactor_0 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1785010 1 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785010 1 idle 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785010 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 
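The reactor_get_thread_ids calls above resolve which spdk_thread ids are pinned to a reactor by filtering the thread_get_stats RPC output with jq; with cpumask 0x1 the answer is thread id 1 (the app_thread), with 0x4 it is still empty. A compact sketch of that lookup, mirroring the filter string in the trace (the arithmetic normalization of 0x1 to 1 and 0x4 to 4 is an assumption inferred from the reactor_cpumask values shown above):

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # Sketch: print the spdk_thread ids whose cpumask matches the given reactor mask.
    reactor_get_thread_ids() {
        local reactor_cpumask=$1
        reactor_cpumask=$((reactor_cpumask))   # 0x1 -> 1, 0x4 -> 4, as in the trace
        $rpc_py thread_get_stats |
            jq --arg reactor_cpumask "$reactor_cpumask" \
               '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
    }
    # Usage, per reactor_set_interrupt.sh: thd0_ids=($(reactor_get_thread_ids 0x1))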
00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:00.227 07:35:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:00.228 07:35:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:00.228 07:35:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:00.228 07:35:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:00.228 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785010 -w 256 00:30:00.228 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785053 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_1' 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785053 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_1 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1785010 2 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785010 2 idle 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785010 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785010 -w 256 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785054 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_2' 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785054 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_2 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:00.486 07:35:32 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:30:00.486 07:35:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:30:00.745 [2024-07-25 07:35:33.203620] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:00.745 07:35:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:01.008 [2024-07-25 07:35:33.443363] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:01.008 [2024-07-25 07:35:33.443661] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:01.008 07:35:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:01.266 [2024-07-25 07:35:33.667270] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
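Every reactor_is_idle / reactor_is_busy verdict in this test boils down to one sample of top for the reactor thread: grep the reactor_N row, take column 9 (%CPU), drop the decimal part, and compare against the thresholds seen in the trace (a busy reactor must not fall below 70%, an idle one must not rise above 30%). A hedged sketch of that check:

    # Sketch of the busy/idle check driven above; pid is the interrupt_tgt PID,
    # idx the reactor index, state either "busy" or "idle".
    reactor_is_busy_or_idle() {
        local pid=$1 idx=$2 state=$3
        local top_reactor cpu_rate
        top_reactor=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}")
        cpu_rate=$(echo "$top_reactor" | sed -e 's/^\s*//g' | awk '{print $9}')
        cpu_rate=${cpu_rate%.*}       # 99.9 -> 99, 6.7 -> 6, 0.0 -> 0
        cpu_rate=${cpu_rate:-0}
        if [[ $state == busy ]]; then
            [[ $cpu_rate -lt 70 ]] && return 1   # busy reactor dropped below 70% CPU
        else
            [[ $cpu_rate -gt 30 ]] && return 1   # idle reactor climbed above 30% CPU
        fi
        return 0
    }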
00:30:01.266 [2024-07-25 07:35:33.667456] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1785010 0 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1785010 0 busy 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785010 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785010 -w 256 00:30:01.266 07:35:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785010 root 20 0 128.2g 36736 24192 R 99.9 0.1 0:00.78 reactor_0' 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785010 root 20 0 128.2g 36736 24192 R 99.9 0.1 0:00.78 reactor_0 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1785010 2 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1785010 2 busy 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785010 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:01.524 07:35:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785010 -w 256 00:30:01.525 07:35:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # 
top_reactor='1785054 root 20 0 128.2g 36736 24192 R 93.8 0.1 0:00.36 reactor_2' 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785054 root 20 0 128.2g 36736 24192 R 93.8 0.1 0:00.36 reactor_2 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:01.525 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:01.783 [2024-07-25 07:35:34.259262] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:30:01.783 [2024-07-25 07:35:34.259355] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1785010 2 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785010 2 idle 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785010 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785010 -w 256 00:30:01.783 07:35:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785054 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.59 reactor_2' 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785054 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.59 reactor_2 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 
idle = \i\d\l\e ]] 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:02.041 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:02.299 [2024-07-25 07:35:34.667259] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:02.299 [2024-07-25 07:35:34.667382] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:02.299 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:30:02.299 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:30:02.299 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:30:02.557 [2024-07-25 07:35:34.895482] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1785010 0 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785010 0 idle 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785010 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785010 -w 256 00:30:02.557 07:35:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785010 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:01.60 reactor_0' 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785010 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:01.60 reactor_0 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:02.557 
07:35:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:30:02.557 07:35:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1785010 00:30:02.557 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1785010 ']' 00:30:02.557 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 1785010 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1785010 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1785010' 00:30:02.816 killing process with pid 1785010 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1785010 00:30:02.816 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1785010 00:30:02.816 07:35:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:30:02.816 07:35:35 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:03.074 07:35:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:30:03.074 07:35:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:03.074 07:35:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:03.074 07:35:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1785885 00:30:03.074 07:35:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:03.074 07:35:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1785885 /var/tmp/spdk.sock 00:30:03.074 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1785885 ']' 00:30:03.074 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:03.074 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:03.074 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:03.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:03.074 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:03.074 07:35:35 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:03.074 07:35:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:03.074 [2024-07-25 07:35:35.416440] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
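With the without-threads pass done, the first interrupt_tgt (pid 1785010) is torn down through killprocess and the AIO backing file is removed before a fresh target is launched for the with-threads pass. A hedged sketch of that teardown pattern as traced above (the handling of the sudo case is simplified here to a refusal; the real helper in autotest_common.sh may do more):

    # Sketch: kill and reap a test target, refusing to touch a sudo wrapper process.
    killprocess() {
        local pid=$1
        [[ -n $pid ]] || return 1
        kill -0 "$pid" 2> /dev/null || return 0            # already gone
        local process_name=""
        [[ $(uname) == Linux ]] && process_name=$(ps --no-headers -o comm= "$pid")
        [[ $process_name == sudo ]] && return 1            # simplification, see note above
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"
    }
    # Cleanup of the AIO fixture, as in interrupt/common.sh:
    rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile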
00:30:03.074 [2024-07-25 07:35:35.416572] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1785885 ] 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.0 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.1 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.2 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.3 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.4 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.5 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.6 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:01.7 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:02.0 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:02.1 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:02.2 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:02.3 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:02.4 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:02.5 cannot be used 00:30:03.074 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.074 EAL: Requested device 0000:3d:02.6 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3d:02.7 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.0 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.1 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.2 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.3 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.4 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.5 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.6 cannot be 
used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:01.7 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.0 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.1 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.2 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.3 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.4 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.5 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.6 cannot be used 00:30:03.075 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:03.075 EAL: Requested device 0000:3f:02.7 cannot be used 00:30:03.333 [2024-07-25 07:35:35.619995] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:03.333 [2024-07-25 07:35:35.705446] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:30:03.333 [2024-07-25 07:35:35.705539] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:30:03.333 [2024-07-25 07:35:35.705544] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:30:03.333 [2024-07-25 07:35:35.773361] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
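The second target (pid 1785885, the reactor_set_mode_with_threads pass) now gets the same treatment, but this time the spdk threads stay where they are: there is no thread_set_cpumask shuffle before the switch, so disabling interrupts on reactor 0 also drags the app_thread from intr to poll mode, as the notices further down show. The mode switches themselves are plain RPCs, taken verbatim from the traces in this log:

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # Put reactors 0 and 2 into poll mode (disable interrupt mode) ...
    $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
    $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
    # ... confirm they show up busy in top, then switch them back and confirm idle:
    $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 2
    $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0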
00:30:03.902 07:35:36 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:03.902 07:35:36 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:30:03.902 07:35:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:30:03.902 07:35:36 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:04.469 Malloc0 00:30:04.469 Malloc1 00:30:04.469 Malloc2 00:30:04.469 07:35:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:30:04.469 07:35:36 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:04.469 07:35:36 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:04.469 07:35:36 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:04.469 5000+0 records in 00:30:04.469 5000+0 records out 00:30:04.469 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0366421 s, 279 MB/s 00:30:04.469 07:35:36 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:04.727 AIO0 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1785885 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1785885 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1785885 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:04.727 07:35:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:04.986 07:35:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:05.244 spdk_thread ids are 1 on reactor0. 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1785885 0 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785885 0 idle 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785885 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785885 -w 256 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785885 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.48 reactor_0' 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785885 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.48 reactor_0 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:05.244 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1785885 1 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785885 1 idle 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785885 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:05.245 07:35:37 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:05.245 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785885 -w 256 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785965 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_1' 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785965 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_1 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1785885 2 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785885 2 idle 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785885 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785885 -w 256 00:30:05.503 07:35:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785966 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_2' 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785966 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.00 reactor_2 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
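Before the reactors are toggled, both passes build the same bdev fixtures: three Malloc bdevs plus an AIO bdev backed by a 10 MB file written with dd (the 5000 records of 2048 bytes above). A hedged sketch of that setup step; the dd and bdev_aio_create lines mirror the trace, while the bdev_malloc_create mention is an assumption about what setup_bdev_mem feeds to rpc.py:

    spdk_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    aiofile=$spdk_dir/test/interrupt/aiofile
    # 10 MB backing file for the AIO bdev: 5000 records of 2048 bytes, as in the trace.
    dd if=/dev/zero of="$aiofile" bs=2048 count=5000
    # Register it as bdev AIO0 with a 2048-byte block size.
    $spdk_dir/scripts/rpc.py bdev_aio_create "$aiofile" AIO0 2048
    # setup_bdev_mem creates Malloc0..Malloc2 through the same rpc.py entry point
    # (presumably bdev_malloc_create; only the resulting names appear in the trace).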
00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:30:05.761 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:06.019 [2024-07-25 07:35:38.334100] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:06.019 [2024-07-25 07:35:38.334316] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:30:06.019 [2024-07-25 07:35:38.334397] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:06.019 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:06.277 [2024-07-25 07:35:38.566600] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:06.277 [2024-07-25 07:35:38.566772] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1785885 0 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1785885 0 busy 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785885 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785885 -w 256 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785885 root 20 0 128.2g 36736 24192 R 99.9 0.1 0:00.89 reactor_0' 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785885 root 20 0 128.2g 36736 24192 R 99.9 0.1 0:00.89 reactor_0 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:06.277 07:35:38 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1785885 2 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1785885 2 busy 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785885 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785885 -w 256 00:30:06.277 07:35:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785966 root 20 0 128.2g 36736 24192 R 99.9 0.1 0:00.36 reactor_2' 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785966 root 20 0 128.2g 36736 24192 R 99.9 0.1 0:00.36 reactor_2 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.535 07:35:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:06.794 [2024-07-25 07:35:39.160271] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:30:06.794 [2024-07-25 07:35:39.160356] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1785885 2 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785885 2 idle 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785885 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785885 -w 256 00:30:06.794 07:35:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785966 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.59 reactor_2' 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785966 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:00.59 reactor_2 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:07.052 [2024-07-25 07:35:39.561290] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:07.052 [2024-07-25 07:35:39.561542] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:30:07.052 [2024-07-25 07:35:39.561573] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1785885 0 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1785885 0 idle 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1785885 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:07.052 07:35:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1785885 -w 256 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1785885 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:01.70 reactor_0' 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1785885 root 20 0 128.2g 36736 24192 S 0.0 0.1 0:01.70 reactor_0 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:30:07.311 07:35:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1785885 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1785885 ']' 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 1785885 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1785885 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1785885' 00:30:07.311 killing process with pid 1785885 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1785885 00:30:07.311 07:35:39 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1785885 00:30:07.569 07:35:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:30:07.569 07:35:40 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:07.569 00:30:07.569 real 0m9.871s 00:30:07.569 user 0m9.168s 00:30:07.569 sys 0m2.165s 00:30:07.569 07:35:40 reactor_set_interrupt -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:07.569 07:35:40 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:07.569 ************************************ 00:30:07.569 END TEST reactor_set_interrupt 00:30:07.569 ************************************ 00:30:07.569 07:35:40 -- spdk/autotest.sh@198 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:07.569 07:35:40 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:30:07.569 07:35:40 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:07.569 07:35:40 -- common/autotest_common.sh@10 -- # set +x 00:30:07.829 ************************************ 00:30:07.829 START TEST reap_unregistered_poller 00:30:07.829 ************************************ 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:07.829 * Looking for test storage... 00:30:07.829 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:07.829 07:35:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:07.829 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:07.829 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:07.829 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:07.829 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
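reap_unregistered_poller.sh opens the same way every interrupt test does: it locates itself, derives the test directory and the repository root, and sources the common helpers. A minimal sketch of the dirname/readlink sequence traced above:

    # Resolve the test directory and the SPDK repo root relative to the running script,
    # mirroring interrupt_common.sh ($0 stands in for the script path shown in the trace).
    testdir=$(readlink -f "$(dirname "$0")")
    rootdir=$(readlink -f "$testdir/../..")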
00:30:07.829 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:07.829 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:07.829 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
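The CONFIG_* lines above come from sourcing test/common/build_config.sh: they are plain shell variables that record how this SPDK tree was configured (crypto, compressdev and UBSAN on, ASAN off, and so on). A hedged illustration of how a test script can key off one of them; this particular guard is not in the trace:

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    source "$rootdir/test/common/build_config.sh"
    if [[ $CONFIG_CRYPTO != y ]]; then
        echo "crypto support not compiled into this build, skipping" >&2
        exit 0
    fi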
00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:07.829 07:35:40 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:07.830 07:35:40 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:07.830 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:07.830 #define SPDK_CONFIG_H 00:30:07.830 #define SPDK_CONFIG_APPS 1 00:30:07.830 #define SPDK_CONFIG_ARCH native 00:30:07.830 #undef SPDK_CONFIG_ASAN 00:30:07.830 #undef SPDK_CONFIG_AVAHI 00:30:07.830 #undef SPDK_CONFIG_CET 00:30:07.830 #define SPDK_CONFIG_COVERAGE 1 00:30:07.830 #define SPDK_CONFIG_CROSS_PREFIX 00:30:07.830 #define SPDK_CONFIG_CRYPTO 1 00:30:07.830 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:07.830 #undef SPDK_CONFIG_CUSTOMOCF 00:30:07.830 #undef SPDK_CONFIG_DAOS 00:30:07.830 #define SPDK_CONFIG_DAOS_DIR 00:30:07.830 #define SPDK_CONFIG_DEBUG 1 00:30:07.830 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:07.830 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:07.830 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:07.830 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:07.830 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:07.830 #undef SPDK_CONFIG_DPDK_UADK 00:30:07.830 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:07.830 #define SPDK_CONFIG_EXAMPLES 1 00:30:07.830 #undef SPDK_CONFIG_FC 00:30:07.830 #define SPDK_CONFIG_FC_PATH 00:30:07.830 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:07.830 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:07.830 #undef SPDK_CONFIG_FUSE 00:30:07.830 #undef SPDK_CONFIG_FUZZER 00:30:07.830 #define SPDK_CONFIG_FUZZER_LIB 00:30:07.830 #undef SPDK_CONFIG_GOLANG 00:30:07.830 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:07.830 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:07.830 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:07.830 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:07.830 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:07.830 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:07.830 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:07.830 #define SPDK_CONFIG_IDXD 1 00:30:07.830 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:07.830 #define SPDK_CONFIG_IPSEC_MB 1 00:30:07.830 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:07.830 #define SPDK_CONFIG_ISAL 1 00:30:07.830 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:07.830 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:07.830 #define SPDK_CONFIG_LIBDIR 00:30:07.830 #undef SPDK_CONFIG_LTO 
00:30:07.830 #define SPDK_CONFIG_MAX_LCORES 128 00:30:07.830 #define SPDK_CONFIG_NVME_CUSE 1 00:30:07.830 #undef SPDK_CONFIG_OCF 00:30:07.830 #define SPDK_CONFIG_OCF_PATH 00:30:07.830 #define SPDK_CONFIG_OPENSSL_PATH 00:30:07.830 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:07.830 #define SPDK_CONFIG_PGO_DIR 00:30:07.830 #undef SPDK_CONFIG_PGO_USE 00:30:07.830 #define SPDK_CONFIG_PREFIX /usr/local 00:30:07.830 #undef SPDK_CONFIG_RAID5F 00:30:07.830 #undef SPDK_CONFIG_RBD 00:30:07.830 #define SPDK_CONFIG_RDMA 1 00:30:07.830 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:07.830 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:07.830 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:07.830 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:07.830 #define SPDK_CONFIG_SHARED 1 00:30:07.830 #undef SPDK_CONFIG_SMA 00:30:07.830 #define SPDK_CONFIG_TESTS 1 00:30:07.830 #undef SPDK_CONFIG_TSAN 00:30:07.830 #define SPDK_CONFIG_UBLK 1 00:30:07.830 #define SPDK_CONFIG_UBSAN 1 00:30:07.830 #undef SPDK_CONFIG_UNIT_TESTS 00:30:07.830 #undef SPDK_CONFIG_URING 00:30:07.830 #define SPDK_CONFIG_URING_PATH 00:30:07.830 #undef SPDK_CONFIG_URING_ZNS 00:30:07.830 #undef SPDK_CONFIG_USDT 00:30:07.830 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:07.830 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:07.830 #undef SPDK_CONFIG_VFIO_USER 00:30:07.830 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:07.830 #define SPDK_CONFIG_VHOST 1 00:30:07.830 #define SPDK_CONFIG_VIRTIO 1 00:30:07.830 #undef SPDK_CONFIG_VTUNE 00:30:07.830 #define SPDK_CONFIG_VTUNE_DIR 00:30:07.830 #define SPDK_CONFIG_WERROR 1 00:30:07.830 #define SPDK_CONFIG_WPDK_DIR 00:30:07.830 #undef SPDK_CONFIG_XNVME 00:30:07.830 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:07.830 07:35:40 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:07.830 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:07.830 07:35:40 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:07.830 07:35:40 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:07.830 07:35:40 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:07.830 07:35:40 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:07.830 07:35:40 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:07.830 07:35:40 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:07.830 07:35:40 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:30:07.830 07:35:40 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:07.830 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:07.830 07:35:40 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:07.830 07:35:40 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:07.830 07:35:40 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:07.830 07:35:40 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:07.830 07:35:40 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:07.831 07:35:40 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:07.831 07:35:40 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:07.831 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:07.832 07:35:40 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 1 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@173 -- # : 0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@179 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@180 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@187 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@202 -- # cat 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:07.832 07:35:40 reap_unregistered_poller -- 
common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@265 -- # export valgrind= 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@265 -- # valgrind= 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@271 -- # uname -s 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@281 -- # MAKE=make 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j112 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@298 -- # export 
HUGEMEM=4096 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@301 -- # TEST_MODE= 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@320 -- # [[ -z 1786786 ]] 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@320 -- # kill -0 1786786 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local mount target_dir 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:30:07.832 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.qitdNl 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@357 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.qitdNl/tests/interrupt /tmp/spdk.qitdNl 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@329 -- # df -T 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:30:07.833 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # 
avails["$mount"]=954302464 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4330127360 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=55081975808 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=61742305280 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=6660329472 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=30866341888 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871150592 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4808704 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=12338696192 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=12348461056 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=9764864 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=30870024192 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871154688 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=1130496 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=6174224384 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=6174228480 00:30:08.092 07:35:40 
reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:30:08.092 * Looking for test storage... 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@370 -- # local target_space new_size 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@374 -- # mount=/ 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@376 -- # target_space=55081975808 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@383 -- # new_size=8874921984 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.092 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@391 -- # return 0 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:08.092 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:08.093 07:35:40 
reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1786922 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:08.093 07:35:40 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1786922 /var/tmp/spdk.sock 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@831 -- # '[' -z 1786922 ']' 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 
00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:08.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:08.093 07:35:40 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:08.093 [2024-07-25 07:35:40.431849] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:30:08.093 [2024-07-25 07:35:40.431911] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1786922 ] 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.0 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.1 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.2 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.3 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.4 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.5 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.6 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:01.7 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.0 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.1 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.2 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.3 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.4 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.5 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.6 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3d:02.7 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.0 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.1 cannot be used 
00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.2 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.3 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.4 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.5 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.6 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:01.7 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.0 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.1 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.2 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.3 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.4 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.5 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.6 cannot be used 00:30:08.093 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:08.093 EAL: Requested device 0000:3f:02.7 cannot be used 00:30:08.093 [2024-07-25 07:35:40.563497] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:08.352 [2024-07-25 07:35:40.650688] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:30:08.352 [2024-07-25 07:35:40.650780] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:30:08.352 [2024-07-25 07:35:40.650784] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:30:08.352 [2024-07-25 07:35:40.718581] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
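[Editor's note, not part of the captured log] At this point the interrupt_tgt launched above is up, with reactors on cores 0-2 and app_thread switched to interrupt mode, listening on /var/tmp/spdk.sock. The trace that follows inspects its pollers through the thread_get_pollers RPC and filters the JSON with jq; an equivalent standalone query is sketched below (assumptions: the target is still running on the default socket, jq is installed, and the command is issued from the SPDK checkout root):
# list the timed pollers registered on the first SPDK thread (app_thread)
./scripts/rpc.py -s /var/tmp/spdk.sock thread_get_pollers | jq -r '.threads[0].timed_pollers[].name'
In the captured run the only timed poller reported is rpc_subsystem_poll_servers, which is the value the test compares against further down before tearing the target down.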
00:30:08.919 07:35:41 reap_unregistered_poller -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:08.919 07:35:41 reap_unregistered_poller -- common/autotest_common.sh@864 -- # return 0 00:30:08.919 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:30:08.919 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:30:08.919 07:35:41 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:30:08.919 07:35:41 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:08.919 07:35:41 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:08.919 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:30:08.919 "name": "app_thread", 00:30:08.919 "id": 1, 00:30:08.919 "active_pollers": [], 00:30:08.919 "timed_pollers": [ 00:30:08.919 { 00:30:08.919 "name": "rpc_subsystem_poll_servers", 00:30:08.919 "id": 1, 00:30:08.919 "state": "waiting", 00:30:08.919 "run_count": 0, 00:30:08.919 "busy_count": 0, 00:30:08.919 "period_ticks": 10000000 00:30:08.919 } 00:30:08.919 ], 00:30:08.919 "paused_pollers": [] 00:30:08.919 }' 00:30:08.919 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:30:08.919 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:30:08.919 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:30:08.919 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:30:09.177 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:30:09.177 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:30:09.177 07:35:41 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:30:09.177 07:35:41 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:09.177 07:35:41 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:09.177 5000+0 records in 00:30:09.177 5000+0 records out 00:30:09.177 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0264041 s, 388 MB/s 00:30:09.177 07:35:41 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:09.177 AIO0 00:30:09.435 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:09.435 07:35:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:30:09.693 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:30:09.693 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:09.693 07:35:42 reap_unregistered_poller -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:30:09.693 "name": "app_thread", 00:30:09.693 "id": 1, 00:30:09.693 "active_pollers": [], 00:30:09.693 "timed_pollers": [ 00:30:09.693 { 00:30:09.693 "name": "rpc_subsystem_poll_servers", 00:30:09.693 "id": 1, 00:30:09.693 "state": "waiting", 00:30:09.693 "run_count": 0, 00:30:09.693 "busy_count": 0, 00:30:09.693 "period_ticks": 10000000 00:30:09.693 } 00:30:09.693 ], 00:30:09.693 "paused_pollers": [] 00:30:09.693 }' 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:30:09.693 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1786922 00:30:09.693 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@950 -- # '[' -z 1786922 ']' 00:30:09.693 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@954 -- # kill -0 1786922 00:30:09.693 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@955 -- # uname 00:30:09.693 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:09.693 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1786922 00:30:09.951 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:09.951 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:09.951 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1786922' 00:30:09.951 killing process with pid 1786922 00:30:09.951 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@969 -- # kill 1786922 00:30:09.951 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@974 -- # wait 1786922 00:30:09.951 07:35:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:30:09.951 07:35:42 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:09.951 00:30:09.951 real 0m2.335s 00:30:09.951 user 0m1.411s 00:30:09.951 sys 0m0.651s 00:30:09.951 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:09.951 07:35:42 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:09.951 ************************************ 00:30:09.951 END TEST reap_unregistered_poller 00:30:09.951 ************************************ 00:30:10.208 07:35:42 -- spdk/autotest.sh@202 -- # uname -s 00:30:10.209 07:35:42 -- spdk/autotest.sh@202 -- # [[ Linux == 
Linux ]] 00:30:10.209 07:35:42 -- spdk/autotest.sh@203 -- # [[ 1 -eq 1 ]] 00:30:10.209 07:35:42 -- spdk/autotest.sh@209 -- # [[ 1 -eq 0 ]] 00:30:10.209 07:35:42 -- spdk/autotest.sh@215 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@264 -- # timing_exit lib 00:30:10.209 07:35:42 -- common/autotest_common.sh@730 -- # xtrace_disable 00:30:10.209 07:35:42 -- common/autotest_common.sh@10 -- # set +x 00:30:10.209 07:35:42 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@283 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@347 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@351 -- # '[' 1 -eq 1 ']' 00:30:10.209 07:35:42 -- spdk/autotest.sh@352 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:10.209 07:35:42 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:30:10.209 07:35:42 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:10.209 07:35:42 -- common/autotest_common.sh@10 -- # set +x 00:30:10.209 ************************************ 00:30:10.209 START TEST compress_compdev 00:30:10.209 ************************************ 00:30:10.209 07:35:42 compress_compdev -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:10.209 * Looking for test storage... 
00:30:10.209 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:10.209 07:35:42 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:10.209 07:35:42 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:10.209 07:35:42 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:10.209 07:35:42 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.209 07:35:42 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.209 07:35:42 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.209 07:35:42 compress_compdev -- paths/export.sh@5 -- # export PATH 00:30:10.209 07:35:42 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:10.209 07:35:42 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1787308 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1787308 00:30:10.209 07:35:42 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1787308 ']' 00:30:10.209 07:35:42 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:10.209 07:35:42 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:10.209 07:35:42 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:10.209 07:35:42 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:10.209 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
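The bdevperf launch that was just traced follows a simple pattern: start the app idle, wait for its RPC socket, build the volume stack, then trigger the run. A hedged sketch with the flags copied from the command above (queue depth 32, 4 KiB I/O, 3 second verify workload, core mask 0x6, DPDK config from test/compress):

# Start bdevperf idle (-z waits for an RPC trigger) with the compressdev DPDK config.
spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
$spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
    -c $spdk/test/compress/dpdk.json &
bdevperf_pid=$!
# The harness then polls /var/tmp/spdk.sock (waitforlisten), creates the volumes,
# and finally starts the workload with:
#   $spdk/examples/bdev/bdevperf/bdevperf.py perform_tests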
00:30:10.209 07:35:42 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:10.209 07:35:42 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:10.467 [2024-07-25 07:35:42.776272] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:30:10.467 [2024-07-25 07:35:42.776337] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1787308 ] 00:30:10.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:10.467 EAL: Requested device 0000:3d:01.0 cannot be used [this message pair repeats for every QAT virtual function from 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7; 32 near-identical entries condensed] 00:30:10.468 [2024-07-25 07:35:42.899387] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:10.468 [2024-07-25 07:35:42.984788] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:30:10.468 [2024-07-25 07:35:42.984795] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:30:11.398 [2024-07-25 07:35:43.668062] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:11.398 07:35:43 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:11.398 07:35:43 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:30:11.398 07:35:43 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:30:11.398 07:35:43 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:11.398 07:35:43 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:14.675 [2024-07-25 07:35:46.808549] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x234efc0 PMD being used: compress_qat 00:30:14.675 07:35:46 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:14.675 07:35:46 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:30:14.675 07:35:46 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:14.675 07:35:46 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:14.675 07:35:46 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:14.675 07:35:46 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:14.675 07:35:46 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:14.675 07:35:47 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:14.933 [ 00:30:14.933 { 00:30:14.933 "name": "Nvme0n1", 00:30:14.933 "aliases": [ 00:30:14.933 "eb063859-3754-4834-a9c4-4c5a24554582" 00:30:14.933 ], 00:30:14.933 "product_name": "NVMe disk", 00:30:14.933 "block_size": 512, 00:30:14.933 "num_blocks": 3907029168, 00:30:14.933 "uuid": "eb063859-3754-4834-a9c4-4c5a24554582", 00:30:14.933 "assigned_rate_limits": { 00:30:14.933 "rw_ios_per_sec": 0, 00:30:14.933 "rw_mbytes_per_sec": 0, 00:30:14.933 "r_mbytes_per_sec": 0, 00:30:14.933 "w_mbytes_per_sec": 0 00:30:14.933 }, 00:30:14.933 "claimed": false, 00:30:14.933 "zoned": false, 00:30:14.933 "supported_io_types": { 00:30:14.933 "read": true, 00:30:14.933 "write": true, 00:30:14.933 "unmap": true, 00:30:14.933 "flush": true, 00:30:14.933 "reset": true, 00:30:14.933 "nvme_admin": true, 00:30:14.933 "nvme_io": true, 00:30:14.933 "nvme_io_md": false, 00:30:14.933 "write_zeroes": true, 00:30:14.933 "zcopy": false, 00:30:14.933 "get_zone_info": false, 00:30:14.933 "zone_management": false, 00:30:14.933 "zone_append": false, 00:30:14.933 "compare": false, 00:30:14.933 "compare_and_write": false, 00:30:14.933 "abort": true, 00:30:14.933 "seek_hole": false, 00:30:14.933 "seek_data": false, 00:30:14.933 "copy": false, 00:30:14.933 "nvme_iov_md": false 00:30:14.933 }, 00:30:14.933 "driver_specific": { 00:30:14.933 "nvme": [ 00:30:14.933 { 00:30:14.933 "pci_address": "0000:d8:00.0", 00:30:14.933 "trid": { 00:30:14.933 "trtype": "PCIe", 00:30:14.933 "traddr": "0000:d8:00.0" 00:30:14.933 }, 00:30:14.933 "ctrlr_data": { 00:30:14.933 "cntlid": 0, 00:30:14.933 "vendor_id": "0x8086", 00:30:14.933 "model_number": "INTEL SSDPE2KX020T8", 00:30:14.933 "serial_number": "BTLJ125505KA2P0BGN", 00:30:14.933 "firmware_revision": "VDV10170", 00:30:14.933 "oacs": { 00:30:14.933 "security": 0, 00:30:14.933 "format": 1, 00:30:14.933 "firmware": 1, 00:30:14.933 "ns_manage": 1 00:30:14.933 }, 00:30:14.933 "multi_ctrlr": false, 00:30:14.933 "ana_reporting": false 00:30:14.933 }, 00:30:14.933 "vs": { 00:30:14.933 "nvme_version": "1.2" 00:30:14.933 }, 00:30:14.933 "ns_data": { 00:30:14.933 "id": 1, 00:30:14.933 "can_share": false 00:30:14.933 } 00:30:14.933 } 00:30:14.933 ], 00:30:14.933 "mp_policy": "active_passive" 00:30:14.933 } 00:30:14.933 } 00:30:14.933 ] 00:30:14.933 07:35:47 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:14.933 07:35:47 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:15.196 [2024-07-25 07:35:47.497695] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x234f930 PMD being used: compress_qat 00:30:16.127 32cda169-20d5-4a5a-be40-34a0218eb4df 00:30:16.127 07:35:48 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:16.385 baf6747f-7848-4dbe-b08e-331b1d5c1826 00:30:16.385 07:35:48 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:16.385 07:35:48 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:30:16.385 07:35:48 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:16.385 07:35:48 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:16.385 07:35:48 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:16.385 07:35:48 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 
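The volume stack being assembled here comes down to two lvol RPCs on top of the NVMe bdev; a short sketch using the names and size from the trace (lvs0, lv0, 100 MiB, thin provisioned):

# Create the lvstore and a thin-provisioned 100 MiB logical volume on Nvme0n1.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
$rpc bdev_lvol_create -t -l lvs0 lv0 100
$rpc bdev_wait_for_examine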
00:30:16.385 07:35:48 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:16.642 07:35:49 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:16.901 [ 00:30:16.901 { 00:30:16.901 "name": "baf6747f-7848-4dbe-b08e-331b1d5c1826", 00:30:16.901 "aliases": [ 00:30:16.901 "lvs0/lv0" 00:30:16.901 ], 00:30:16.901 "product_name": "Logical Volume", 00:30:16.901 "block_size": 512, 00:30:16.901 "num_blocks": 204800, 00:30:16.901 "uuid": "baf6747f-7848-4dbe-b08e-331b1d5c1826", 00:30:16.901 "assigned_rate_limits": { 00:30:16.901 "rw_ios_per_sec": 0, 00:30:16.901 "rw_mbytes_per_sec": 0, 00:30:16.901 "r_mbytes_per_sec": 0, 00:30:16.901 "w_mbytes_per_sec": 0 00:30:16.901 }, 00:30:16.901 "claimed": false, 00:30:16.901 "zoned": false, 00:30:16.901 "supported_io_types": { 00:30:16.901 "read": true, 00:30:16.901 "write": true, 00:30:16.901 "unmap": true, 00:30:16.901 "flush": false, 00:30:16.901 "reset": true, 00:30:16.901 "nvme_admin": false, 00:30:16.901 "nvme_io": false, 00:30:16.901 "nvme_io_md": false, 00:30:16.901 "write_zeroes": true, 00:30:16.901 "zcopy": false, 00:30:16.901 "get_zone_info": false, 00:30:16.901 "zone_management": false, 00:30:16.901 "zone_append": false, 00:30:16.901 "compare": false, 00:30:16.901 "compare_and_write": false, 00:30:16.901 "abort": false, 00:30:16.901 "seek_hole": true, 00:30:16.901 "seek_data": true, 00:30:16.901 "copy": false, 00:30:16.901 "nvme_iov_md": false 00:30:16.901 }, 00:30:16.901 "driver_specific": { 00:30:16.901 "lvol": { 00:30:16.902 "lvol_store_uuid": "32cda169-20d5-4a5a-be40-34a0218eb4df", 00:30:16.902 "base_bdev": "Nvme0n1", 00:30:16.902 "thin_provision": true, 00:30:16.902 "num_allocated_clusters": 0, 00:30:16.902 "snapshot": false, 00:30:16.902 "clone": false, 00:30:16.902 "esnap_clone": false 00:30:16.902 } 00:30:16.902 } 00:30:16.902 } 00:30:16.902 ] 00:30:16.902 07:35:49 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:16.902 07:35:49 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:16.902 07:35:49 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:16.902 [2024-07-25 07:35:49.384608] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:16.902 COMP_lvs0/lv0 00:30:16.902 07:35:49 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:16.902 07:35:49 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:30:16.902 07:35:49 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:16.902 07:35:49 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:16.902 07:35:49 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:16.902 07:35:49 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:16.902 07:35:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:17.193 07:35:49 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:17.451 [ 00:30:17.451 { 00:30:17.451 "name": "COMP_lvs0/lv0", 00:30:17.451 "aliases": 
[ 00:30:17.451 "c3a20314-78b0-53f6-883a-8f745afe6624" 00:30:17.451 ], 00:30:17.451 "product_name": "compress", 00:30:17.451 "block_size": 512, 00:30:17.451 "num_blocks": 200704, 00:30:17.451 "uuid": "c3a20314-78b0-53f6-883a-8f745afe6624", 00:30:17.451 "assigned_rate_limits": { 00:30:17.451 "rw_ios_per_sec": 0, 00:30:17.451 "rw_mbytes_per_sec": 0, 00:30:17.451 "r_mbytes_per_sec": 0, 00:30:17.451 "w_mbytes_per_sec": 0 00:30:17.451 }, 00:30:17.451 "claimed": false, 00:30:17.451 "zoned": false, 00:30:17.451 "supported_io_types": { 00:30:17.451 "read": true, 00:30:17.451 "write": true, 00:30:17.451 "unmap": false, 00:30:17.451 "flush": false, 00:30:17.451 "reset": false, 00:30:17.451 "nvme_admin": false, 00:30:17.451 "nvme_io": false, 00:30:17.451 "nvme_io_md": false, 00:30:17.451 "write_zeroes": true, 00:30:17.451 "zcopy": false, 00:30:17.451 "get_zone_info": false, 00:30:17.451 "zone_management": false, 00:30:17.451 "zone_append": false, 00:30:17.451 "compare": false, 00:30:17.451 "compare_and_write": false, 00:30:17.451 "abort": false, 00:30:17.451 "seek_hole": false, 00:30:17.451 "seek_data": false, 00:30:17.451 "copy": false, 00:30:17.451 "nvme_iov_md": false 00:30:17.451 }, 00:30:17.451 "driver_specific": { 00:30:17.451 "compress": { 00:30:17.451 "name": "COMP_lvs0/lv0", 00:30:17.451 "base_bdev_name": "baf6747f-7848-4dbe-b08e-331b1d5c1826", 00:30:17.451 "pm_path": "/tmp/pmem/09915a4b-44de-4667-819b-3298cba21bb4" 00:30:17.451 } 00:30:17.451 } 00:30:17.451 } 00:30:17.451 ] 00:30:17.451 07:35:49 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:17.451 07:35:49 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:17.708 [2024-07-25 07:35:49.995034] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f25941b15c0 PMD being used: compress_qat 00:30:17.708 [2024-07-25 07:35:49.997102] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x234b7a0 PMD being used: compress_qat 00:30:17.708 Running I/O for 3 seconds... 
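At this point the full stack is in place (NVMe bdev, logical volume, compress vbdev backed by /tmp/pmem) and the verify workload has just been kicked off. The last two steps amount to the following; a sketch only, with paths as used elsewhere in this job:

# Claim the logical volume as a compress vbdev and start the bdevperf workload.
spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
$spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem
$spdk/examples/bdev/bdevperf/bdevperf.py perform_tests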
00:30:20.987 00:30:20.987 Latency(us) 00:30:20.987 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:20.987 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:20.987 Verification LBA range: start 0x0 length 0x3100 00:30:20.987 COMP_lvs0/lv0 : 3.00 4228.61 16.52 0.00 0.00 7522.30 129.43 12582.91 00:30:20.987 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:20.987 Verification LBA range: start 0x3100 length 0x3100 00:30:20.987 COMP_lvs0/lv0 : 3.00 4310.53 16.84 0.00 0.00 7389.26 122.06 12373.20 00:30:20.987 =================================================================================================================== 00:30:20.987 Total : 8539.14 33.36 0.00 0.00 7455.13 122.06 12582.91 00:30:20.987 0 00:30:20.987 07:35:53 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:20.987 07:35:53 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:20.987 07:35:53 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:20.987 07:35:53 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:20.987 07:35:53 compress_compdev -- compress/compress.sh@78 -- # killprocess 1787308 00:30:20.987 07:35:53 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1787308 ']' 00:30:20.987 07:35:53 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1787308 00:30:20.987 07:35:53 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:30:20.987 07:35:53 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:20.987 07:35:53 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1787308 00:30:21.244 07:35:53 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:30:21.244 07:35:53 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:30:21.244 07:35:53 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1787308' 00:30:21.244 killing process with pid 1787308 00:30:21.244 07:35:53 compress_compdev -- common/autotest_common.sh@969 -- # kill 1787308 00:30:21.244 Received shutdown signal, test time was about 3.000000 seconds 00:30:21.244 00:30:21.244 Latency(us) 00:30:21.244 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:21.244 =================================================================================================================== 00:30:21.244 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:21.244 07:35:53 compress_compdev -- common/autotest_common.sh@974 -- # wait 1787308 00:30:23.769 07:35:55 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:23.769 07:35:55 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:23.769 07:35:55 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1789442 00:30:23.769 07:35:55 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:23.769 07:35:55 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:23.769 07:35:55 
compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1789442 00:30:23.769 07:35:55 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1789442 ']' 00:30:23.769 07:35:55 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:23.769 07:35:55 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:23.769 07:35:55 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:23.769 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:23.769 07:35:55 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:23.769 07:35:55 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:23.769 [2024-07-25 07:35:55.937973] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:30:23.769 [2024-07-25 07:35:55.938035] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1789442 ] 00:30:23.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:23.769 EAL: Requested device 0000:3d:01.0 cannot be used [this message pair repeats for every QAT virtual function from 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7; 32 near-identical entries condensed] 00:30:23.769 [2024-07-25 07:35:56.057656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:23.769 [2024-07-25 07:35:56.144369] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:30:23.769 [2024-07-25 07:35:56.144376] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:30:24.334 [2024-07-25 07:35:56.828241] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:24.592 07:35:56 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:24.592 07:35:56 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:30:24.592 07:35:56 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:30:24.592 07:35:56 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:24.592 07:35:56 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:27.868 [2024-07-25 07:35:59.977494] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x108dfc0 PMD being used: compress_qat 00:30:27.868 07:36:00 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:27.868 07:36:00 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:30:27.868 
07:36:00 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:27.868 07:36:00 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:27.868 07:36:00 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:27.868 07:36:00 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:27.868 07:36:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:27.868 07:36:00 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:28.126 [ 00:30:28.126 { 00:30:28.126 "name": "Nvme0n1", 00:30:28.126 "aliases": [ 00:30:28.126 "a60b70ae-19fa-4391-b488-aba1a9c61ef6" 00:30:28.126 ], 00:30:28.126 "product_name": "NVMe disk", 00:30:28.126 "block_size": 512, 00:30:28.126 "num_blocks": 3907029168, 00:30:28.126 "uuid": "a60b70ae-19fa-4391-b488-aba1a9c61ef6", 00:30:28.126 "assigned_rate_limits": { 00:30:28.126 "rw_ios_per_sec": 0, 00:30:28.126 "rw_mbytes_per_sec": 0, 00:30:28.126 "r_mbytes_per_sec": 0, 00:30:28.126 "w_mbytes_per_sec": 0 00:30:28.126 }, 00:30:28.126 "claimed": false, 00:30:28.126 "zoned": false, 00:30:28.126 "supported_io_types": { 00:30:28.126 "read": true, 00:30:28.126 "write": true, 00:30:28.126 "unmap": true, 00:30:28.126 "flush": true, 00:30:28.126 "reset": true, 00:30:28.126 "nvme_admin": true, 00:30:28.126 "nvme_io": true, 00:30:28.126 "nvme_io_md": false, 00:30:28.126 "write_zeroes": true, 00:30:28.126 "zcopy": false, 00:30:28.126 "get_zone_info": false, 00:30:28.126 "zone_management": false, 00:30:28.126 "zone_append": false, 00:30:28.126 "compare": false, 00:30:28.126 "compare_and_write": false, 00:30:28.126 "abort": true, 00:30:28.126 "seek_hole": false, 00:30:28.126 "seek_data": false, 00:30:28.126 "copy": false, 00:30:28.126 "nvme_iov_md": false 00:30:28.126 }, 00:30:28.126 "driver_specific": { 00:30:28.126 "nvme": [ 00:30:28.126 { 00:30:28.126 "pci_address": "0000:d8:00.0", 00:30:28.126 "trid": { 00:30:28.126 "trtype": "PCIe", 00:30:28.126 "traddr": "0000:d8:00.0" 00:30:28.126 }, 00:30:28.126 "ctrlr_data": { 00:30:28.126 "cntlid": 0, 00:30:28.126 "vendor_id": "0x8086", 00:30:28.126 "model_number": "INTEL SSDPE2KX020T8", 00:30:28.126 "serial_number": "BTLJ125505KA2P0BGN", 00:30:28.126 "firmware_revision": "VDV10170", 00:30:28.126 "oacs": { 00:30:28.126 "security": 0, 00:30:28.126 "format": 1, 00:30:28.126 "firmware": 1, 00:30:28.126 "ns_manage": 1 00:30:28.126 }, 00:30:28.126 "multi_ctrlr": false, 00:30:28.126 "ana_reporting": false 00:30:28.126 }, 00:30:28.126 "vs": { 00:30:28.126 "nvme_version": "1.2" 00:30:28.126 }, 00:30:28.126 "ns_data": { 00:30:28.126 "id": 1, 00:30:28.126 "can_share": false 00:30:28.126 } 00:30:28.126 } 00:30:28.126 ], 00:30:28.126 "mp_policy": "active_passive" 00:30:28.126 } 00:30:28.126 } 00:30:28.126 ] 00:30:28.126 07:36:00 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:28.126 07:36:00 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:28.384 [2024-07-25 07:36:00.662611] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xec52a0 PMD being used: compress_qat 00:30:29.316 68a01480-1fda-41cc-84df-0155d20550ad 00:30:29.316 07:36:01 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_lvol_create -t -l lvs0 lv0 100 00:30:29.573 d3f94e31-a4fb-4363-ad91-423962d2e9be 00:30:29.573 07:36:01 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:29.573 07:36:01 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:30:29.573 07:36:01 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:29.573 07:36:01 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:29.573 07:36:01 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:29.573 07:36:01 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:29.573 07:36:01 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:29.830 07:36:02 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:29.830 [ 00:30:29.830 { 00:30:29.830 "name": "d3f94e31-a4fb-4363-ad91-423962d2e9be", 00:30:29.830 "aliases": [ 00:30:29.830 "lvs0/lv0" 00:30:29.830 ], 00:30:29.830 "product_name": "Logical Volume", 00:30:29.830 "block_size": 512, 00:30:29.830 "num_blocks": 204800, 00:30:29.830 "uuid": "d3f94e31-a4fb-4363-ad91-423962d2e9be", 00:30:29.830 "assigned_rate_limits": { 00:30:29.830 "rw_ios_per_sec": 0, 00:30:29.830 "rw_mbytes_per_sec": 0, 00:30:29.830 "r_mbytes_per_sec": 0, 00:30:29.830 "w_mbytes_per_sec": 0 00:30:29.830 }, 00:30:29.830 "claimed": false, 00:30:29.830 "zoned": false, 00:30:29.830 "supported_io_types": { 00:30:29.830 "read": true, 00:30:29.830 "write": true, 00:30:29.830 "unmap": true, 00:30:29.830 "flush": false, 00:30:29.830 "reset": true, 00:30:29.830 "nvme_admin": false, 00:30:29.830 "nvme_io": false, 00:30:29.830 "nvme_io_md": false, 00:30:29.830 "write_zeroes": true, 00:30:29.830 "zcopy": false, 00:30:29.830 "get_zone_info": false, 00:30:29.830 "zone_management": false, 00:30:29.830 "zone_append": false, 00:30:29.830 "compare": false, 00:30:29.830 "compare_and_write": false, 00:30:29.830 "abort": false, 00:30:29.830 "seek_hole": true, 00:30:29.830 "seek_data": true, 00:30:29.830 "copy": false, 00:30:29.830 "nvme_iov_md": false 00:30:29.830 }, 00:30:29.830 "driver_specific": { 00:30:29.830 "lvol": { 00:30:29.830 "lvol_store_uuid": "68a01480-1fda-41cc-84df-0155d20550ad", 00:30:29.830 "base_bdev": "Nvme0n1", 00:30:29.830 "thin_provision": true, 00:30:29.830 "num_allocated_clusters": 0, 00:30:29.830 "snapshot": false, 00:30:29.830 "clone": false, 00:30:29.830 "esnap_clone": false 00:30:29.830 } 00:30:29.830 } 00:30:29.830 } 00:30:29.830 ] 00:30:30.087 07:36:02 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:30.087 07:36:02 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:30.087 07:36:02 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:30.087 [2024-07-25 07:36:02.600075] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:30.087 COMP_lvs0/lv0 00:30:30.087 07:36:02 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:30.087 07:36:02 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:30:30.087 07:36:02 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:30.087 07:36:02 compress_compdev -- 
common/autotest_common.sh@901 -- # local i 00:30:30.087 07:36:02 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:30.087 07:36:02 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:30.087 07:36:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:30.344 07:36:02 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:30.601 [ 00:30:30.601 { 00:30:30.601 "name": "COMP_lvs0/lv0", 00:30:30.601 "aliases": [ 00:30:30.601 "2ede18f4-fb5c-5ed2-a5d0-e2ca7c50b474" 00:30:30.601 ], 00:30:30.601 "product_name": "compress", 00:30:30.601 "block_size": 512, 00:30:30.601 "num_blocks": 200704, 00:30:30.601 "uuid": "2ede18f4-fb5c-5ed2-a5d0-e2ca7c50b474", 00:30:30.601 "assigned_rate_limits": { 00:30:30.601 "rw_ios_per_sec": 0, 00:30:30.601 "rw_mbytes_per_sec": 0, 00:30:30.601 "r_mbytes_per_sec": 0, 00:30:30.601 "w_mbytes_per_sec": 0 00:30:30.601 }, 00:30:30.601 "claimed": false, 00:30:30.601 "zoned": false, 00:30:30.601 "supported_io_types": { 00:30:30.601 "read": true, 00:30:30.601 "write": true, 00:30:30.601 "unmap": false, 00:30:30.601 "flush": false, 00:30:30.601 "reset": false, 00:30:30.601 "nvme_admin": false, 00:30:30.601 "nvme_io": false, 00:30:30.601 "nvme_io_md": false, 00:30:30.601 "write_zeroes": true, 00:30:30.601 "zcopy": false, 00:30:30.601 "get_zone_info": false, 00:30:30.601 "zone_management": false, 00:30:30.601 "zone_append": false, 00:30:30.601 "compare": false, 00:30:30.601 "compare_and_write": false, 00:30:30.601 "abort": false, 00:30:30.601 "seek_hole": false, 00:30:30.601 "seek_data": false, 00:30:30.601 "copy": false, 00:30:30.601 "nvme_iov_md": false 00:30:30.601 }, 00:30:30.601 "driver_specific": { 00:30:30.601 "compress": { 00:30:30.601 "name": "COMP_lvs0/lv0", 00:30:30.601 "base_bdev_name": "d3f94e31-a4fb-4363-ad91-423962d2e9be", 00:30:30.601 "pm_path": "/tmp/pmem/e9795412-94bd-4bd3-8f55-a766afcf940f" 00:30:30.601 } 00:30:30.601 } 00:30:30.601 } 00:30:30.601 ] 00:30:30.601 07:36:03 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:30.601 07:36:03 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:30.858 [2024-07-25 07:36:03.158240] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f16d41b15c0 PMD being used: compress_qat 00:30:30.858 [2024-07-25 07:36:03.160308] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x108a950 PMD being used: compress_qat 00:30:30.858 Running I/O for 3 seconds... 
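This second pass runs the same verify workload; the difference is the -l 512 argument given to bdev_compress_create (the compressed volume's logical block size), and each pass ends with the same teardown before the next one starts. A sketch of both, reusing the commands from the trace (bdevperf_pid stands for the bdevperf instance of the current pass):

# Variant used in this pass: compress vbdev with a 512 B logical block size.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512
# Per-pass teardown (destroy_vols): drop the compress vbdev, then the lvstore,
# then stop bdevperf.
$rpc bdev_compress_delete COMP_lvs0/lv0
$rpc bdev_lvol_delete_lvstore -l lvs0
kill "$bdevperf_pid" && wait "$bdevperf_pid"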
00:30:34.131 00:30:34.131 Latency(us) 00:30:34.131 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:34.131 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:34.131 Verification LBA range: start 0x0 length 0x3100 00:30:34.131 COMP_lvs0/lv0 : 3.01 4077.64 15.93 0.00 0.00 7787.86 128.61 15204.35 00:30:34.131 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:34.131 Verification LBA range: start 0x3100 length 0x3100 00:30:34.131 COMP_lvs0/lv0 : 3.01 4210.48 16.45 0.00 0.00 7558.01 122.06 13946.06 00:30:34.131 =================================================================================================================== 00:30:34.131 Total : 8288.12 32.38 0.00 0.00 7671.17 122.06 15204.35 00:30:34.131 0 00:30:34.131 07:36:06 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:34.131 07:36:06 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:34.131 07:36:06 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:34.131 07:36:06 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:34.131 07:36:06 compress_compdev -- compress/compress.sh@78 -- # killprocess 1789442 00:30:34.131 07:36:06 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1789442 ']' 00:30:34.131 07:36:06 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1789442 00:30:34.131 07:36:06 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:30:34.131 07:36:06 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:34.131 07:36:06 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1789442 00:30:34.427 07:36:06 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:30:34.427 07:36:06 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:30:34.427 07:36:06 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1789442' 00:30:34.427 killing process with pid 1789442 00:30:34.427 07:36:06 compress_compdev -- common/autotest_common.sh@969 -- # kill 1789442 00:30:34.427 Received shutdown signal, test time was about 3.000000 seconds 00:30:34.427 00:30:34.427 Latency(us) 00:30:34.427 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:34.427 =================================================================================================================== 00:30:34.427 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:34.427 07:36:06 compress_compdev -- common/autotest_common.sh@974 -- # wait 1789442 00:30:36.951 07:36:09 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:36.951 07:36:09 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:36.951 07:36:09 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1792133 00:30:36.951 07:36:09 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:36.951 07:36:09 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:36.951 07:36:09 
compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1792133 00:30:36.951 07:36:09 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1792133 ']' 00:30:36.951 07:36:09 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:36.951 07:36:09 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:36.951 07:36:09 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:36.951 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:36.951 07:36:09 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:36.951 07:36:09 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:36.951 [2024-07-25 07:36:09.106951] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:30:36.951 [2024-07-25 07:36:09.107016] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1792133 ] 00:30:36.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:36.951 EAL: Requested device 0000:3d:01.0 cannot be used [this message pair repeats for every QAT virtual function from 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7; 32 near-identical entries condensed] 00:30:36.952 [2024-07-25 07:36:09.228515] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:36.952 [2024-07-25 07:36:09.315649] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:30:36.952 [2024-07-25 07:36:09.315654] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:30:37.518 [2024-07-25 07:36:09.994766] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:37.518 07:36:10 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:37.518 07:36:10 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:30:37.518 07:36:10 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:30:37.518 07:36:10 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:37.518 07:36:10 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:40.801 [2024-07-25 07:36:13.144505] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fdcfc0 PMD being used: compress_qat 00:30:40.801 07:36:13 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:40.801 07:36:13 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:30:40.801 
07:36:13 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:40.801 07:36:13 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:40.801 07:36:13 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:40.801 07:36:13 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:40.801 07:36:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:41.057 07:36:13 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:41.315 [ 00:30:41.315 { 00:30:41.315 "name": "Nvme0n1", 00:30:41.315 "aliases": [ 00:30:41.315 "0c196480-e40b-45b5-b0b3-4ac736dd83fc" 00:30:41.315 ], 00:30:41.315 "product_name": "NVMe disk", 00:30:41.315 "block_size": 512, 00:30:41.315 "num_blocks": 3907029168, 00:30:41.315 "uuid": "0c196480-e40b-45b5-b0b3-4ac736dd83fc", 00:30:41.315 "assigned_rate_limits": { 00:30:41.315 "rw_ios_per_sec": 0, 00:30:41.315 "rw_mbytes_per_sec": 0, 00:30:41.315 "r_mbytes_per_sec": 0, 00:30:41.315 "w_mbytes_per_sec": 0 00:30:41.315 }, 00:30:41.315 "claimed": false, 00:30:41.315 "zoned": false, 00:30:41.315 "supported_io_types": { 00:30:41.315 "read": true, 00:30:41.315 "write": true, 00:30:41.315 "unmap": true, 00:30:41.315 "flush": true, 00:30:41.315 "reset": true, 00:30:41.315 "nvme_admin": true, 00:30:41.315 "nvme_io": true, 00:30:41.315 "nvme_io_md": false, 00:30:41.315 "write_zeroes": true, 00:30:41.315 "zcopy": false, 00:30:41.315 "get_zone_info": false, 00:30:41.315 "zone_management": false, 00:30:41.315 "zone_append": false, 00:30:41.315 "compare": false, 00:30:41.315 "compare_and_write": false, 00:30:41.315 "abort": true, 00:30:41.315 "seek_hole": false, 00:30:41.315 "seek_data": false, 00:30:41.315 "copy": false, 00:30:41.315 "nvme_iov_md": false 00:30:41.315 }, 00:30:41.315 "driver_specific": { 00:30:41.315 "nvme": [ 00:30:41.315 { 00:30:41.315 "pci_address": "0000:d8:00.0", 00:30:41.315 "trid": { 00:30:41.315 "trtype": "PCIe", 00:30:41.315 "traddr": "0000:d8:00.0" 00:30:41.315 }, 00:30:41.315 "ctrlr_data": { 00:30:41.315 "cntlid": 0, 00:30:41.315 "vendor_id": "0x8086", 00:30:41.315 "model_number": "INTEL SSDPE2KX020T8", 00:30:41.315 "serial_number": "BTLJ125505KA2P0BGN", 00:30:41.315 "firmware_revision": "VDV10170", 00:30:41.315 "oacs": { 00:30:41.315 "security": 0, 00:30:41.315 "format": 1, 00:30:41.315 "firmware": 1, 00:30:41.315 "ns_manage": 1 00:30:41.315 }, 00:30:41.315 "multi_ctrlr": false, 00:30:41.315 "ana_reporting": false 00:30:41.315 }, 00:30:41.315 "vs": { 00:30:41.315 "nvme_version": "1.2" 00:30:41.316 }, 00:30:41.316 "ns_data": { 00:30:41.316 "id": 1, 00:30:41.316 "can_share": false 00:30:41.316 } 00:30:41.316 } 00:30:41.316 ], 00:30:41.316 "mp_policy": "active_passive" 00:30:41.316 } 00:30:41.316 } 00:30:41.316 ] 00:30:41.316 07:36:13 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:41.316 07:36:13 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:41.316 [2024-07-25 07:36:13.837219] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fdd930 PMD being used: compress_qat 00:30:42.690 9c7ec3c9-28d4-4589-96dc-8fcd0ca12a91 00:30:42.690 07:36:14 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:42.690 1d669351-0609-4cd9-a7f7-6554d40e5842 00:30:42.690 07:36:15 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:42.690 07:36:15 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:30:42.690 07:36:15 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:42.690 07:36:15 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:42.690 07:36:15 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:42.690 07:36:15 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:42.690 07:36:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:42.948 07:36:15 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:43.207 [ 00:30:43.207 { 00:30:43.207 "name": "1d669351-0609-4cd9-a7f7-6554d40e5842", 00:30:43.207 "aliases": [ 00:30:43.207 "lvs0/lv0" 00:30:43.207 ], 00:30:43.207 "product_name": "Logical Volume", 00:30:43.207 "block_size": 512, 00:30:43.207 "num_blocks": 204800, 00:30:43.207 "uuid": "1d669351-0609-4cd9-a7f7-6554d40e5842", 00:30:43.207 "assigned_rate_limits": { 00:30:43.207 "rw_ios_per_sec": 0, 00:30:43.207 "rw_mbytes_per_sec": 0, 00:30:43.207 "r_mbytes_per_sec": 0, 00:30:43.207 "w_mbytes_per_sec": 0 00:30:43.207 }, 00:30:43.207 "claimed": false, 00:30:43.207 "zoned": false, 00:30:43.207 "supported_io_types": { 00:30:43.207 "read": true, 00:30:43.207 "write": true, 00:30:43.207 "unmap": true, 00:30:43.207 "flush": false, 00:30:43.207 "reset": true, 00:30:43.207 "nvme_admin": false, 00:30:43.207 "nvme_io": false, 00:30:43.207 "nvme_io_md": false, 00:30:43.207 "write_zeroes": true, 00:30:43.207 "zcopy": false, 00:30:43.207 "get_zone_info": false, 00:30:43.207 "zone_management": false, 00:30:43.207 "zone_append": false, 00:30:43.207 "compare": false, 00:30:43.207 "compare_and_write": false, 00:30:43.207 "abort": false, 00:30:43.207 "seek_hole": true, 00:30:43.207 "seek_data": true, 00:30:43.207 "copy": false, 00:30:43.207 "nvme_iov_md": false 00:30:43.207 }, 00:30:43.207 "driver_specific": { 00:30:43.207 "lvol": { 00:30:43.207 "lvol_store_uuid": "9c7ec3c9-28d4-4589-96dc-8fcd0ca12a91", 00:30:43.207 "base_bdev": "Nvme0n1", 00:30:43.207 "thin_provision": true, 00:30:43.207 "num_allocated_clusters": 0, 00:30:43.207 "snapshot": false, 00:30:43.207 "clone": false, 00:30:43.207 "esnap_clone": false 00:30:43.207 } 00:30:43.207 } 00:30:43.207 } 00:30:43.207 ] 00:30:43.207 07:36:15 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:43.207 07:36:15 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:43.207 07:36:15 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:43.207 [2024-07-25 07:36:15.722944] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:43.207 COMP_lvs0/lv0 00:30:43.207 07:36:15 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:43.207 07:36:15 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:30:43.208 07:36:15 compress_compdev -- common/autotest_common.sh@900 -- # 
local bdev_timeout= 00:30:43.208 07:36:15 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:43.466 07:36:15 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:43.466 07:36:15 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:43.467 07:36:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:43.467 07:36:15 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:43.725 [ 00:30:43.725 { 00:30:43.725 "name": "COMP_lvs0/lv0", 00:30:43.725 "aliases": [ 00:30:43.725 "dec5f92f-3a3d-59a2-8222-550aa73af81b" 00:30:43.725 ], 00:30:43.725 "product_name": "compress", 00:30:43.725 "block_size": 4096, 00:30:43.725 "num_blocks": 25088, 00:30:43.725 "uuid": "dec5f92f-3a3d-59a2-8222-550aa73af81b", 00:30:43.725 "assigned_rate_limits": { 00:30:43.725 "rw_ios_per_sec": 0, 00:30:43.725 "rw_mbytes_per_sec": 0, 00:30:43.725 "r_mbytes_per_sec": 0, 00:30:43.725 "w_mbytes_per_sec": 0 00:30:43.725 }, 00:30:43.725 "claimed": false, 00:30:43.725 "zoned": false, 00:30:43.725 "supported_io_types": { 00:30:43.725 "read": true, 00:30:43.725 "write": true, 00:30:43.725 "unmap": false, 00:30:43.726 "flush": false, 00:30:43.726 "reset": false, 00:30:43.726 "nvme_admin": false, 00:30:43.726 "nvme_io": false, 00:30:43.726 "nvme_io_md": false, 00:30:43.726 "write_zeroes": true, 00:30:43.726 "zcopy": false, 00:30:43.726 "get_zone_info": false, 00:30:43.726 "zone_management": false, 00:30:43.726 "zone_append": false, 00:30:43.726 "compare": false, 00:30:43.726 "compare_and_write": false, 00:30:43.726 "abort": false, 00:30:43.726 "seek_hole": false, 00:30:43.726 "seek_data": false, 00:30:43.726 "copy": false, 00:30:43.726 "nvme_iov_md": false 00:30:43.726 }, 00:30:43.726 "driver_specific": { 00:30:43.726 "compress": { 00:30:43.726 "name": "COMP_lvs0/lv0", 00:30:43.726 "base_bdev_name": "1d669351-0609-4cd9-a7f7-6554d40e5842", 00:30:43.726 "pm_path": "/tmp/pmem/3a4a07a1-cc59-4201-94ae-d25e2c4ac58a" 00:30:43.726 } 00:30:43.726 } 00:30:43.726 } 00:30:43.726 ] 00:30:43.726 07:36:16 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:43.726 07:36:16 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:43.985 [2024-07-25 07:36:16.293245] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f77141b15c0 PMD being used: compress_qat 00:30:43.985 [2024-07-25 07:36:16.295268] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fd98c0 PMD being used: compress_qat 00:30:43.985 Running I/O for 3 seconds... 
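For reference, the create_vols pass traced above (the 4 KiB-block compdev run) reduces to the following rpc.py sequence. This is a condensed sketch, not the verbatim compress.sh helper: the paths, bdev names and the -l 4096 block size are copied from the trace, while the $rpc_py shorthand and the pipe from gen_nvme.sh into load_subsystem_config are assumptions about how the helper wires those two calls together.

rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

# attach the NVMe controller (the trace runs gen_nvme.sh and load_subsystem_config
# back to back at compress.sh@34; feeding the generated JSON in via a pipe is assumed)
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh | $rpc_py load_subsystem_config

# lvol store on the raw Nvme0n1 bdev, created without clearing it
$rpc_py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0

# 100 MiB thin-provisioned logical volume
$rpc_py bdev_lvol_create -t -l lvs0 lv0 100

# compress vbdev on top of the lvol, pmem backing under /tmp/pmem,
# 4096-byte logical blocks (hence block_size 4096 / num_blocks 25088 in the dump above)
$rpc_py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096

# let the stack settle and confirm it is visible, mirroring waitforbdev in the trace
$rpc_py bdev_wait_for_examine
$rpc_py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000

bdevperf itself only starts issuing I/O once bdevperf.py perform_tests is called over the RPC socket, which is what produces the Latency(us) table that follows.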
00:30:47.274 00:30:47.274 Latency(us) 00:30:47.274 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:47.275 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:47.275 Verification LBA range: start 0x0 length 0x3100 00:30:47.275 COMP_lvs0/lv0 : 3.00 3916.57 15.30 0.00 0.00 8123.30 176.13 15938.36 00:30:47.275 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:47.275 Verification LBA range: start 0x3100 length 0x3100 00:30:47.275 COMP_lvs0/lv0 : 3.00 3981.92 15.55 0.00 0.00 7999.08 168.76 15309.21 00:30:47.275 =================================================================================================================== 00:30:47.275 Total : 7898.49 30.85 0.00 0.00 8060.67 168.76 15938.36 00:30:47.275 0 00:30:47.275 07:36:19 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:47.275 07:36:19 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:47.275 07:36:19 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:47.275 07:36:19 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:47.275 07:36:19 compress_compdev -- compress/compress.sh@78 -- # killprocess 1792133 00:30:47.275 07:36:19 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1792133 ']' 00:30:47.275 07:36:19 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1792133 00:30:47.275 07:36:19 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:30:47.275 07:36:19 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:47.275 07:36:19 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1792133 00:30:47.534 07:36:19 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:30:47.534 07:36:19 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:30:47.534 07:36:19 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1792133' 00:30:47.534 killing process with pid 1792133 00:30:47.534 07:36:19 compress_compdev -- common/autotest_common.sh@969 -- # kill 1792133 00:30:47.534 Received shutdown signal, test time was about 3.000000 seconds 00:30:47.534 00:30:47.534 Latency(us) 00:30:47.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:47.534 =================================================================================================================== 00:30:47.534 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:47.534 07:36:19 compress_compdev -- common/autotest_common.sh@974 -- # wait 1792133 00:30:50.067 07:36:22 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:30:50.067 07:36:22 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:50.067 07:36:22 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1794405 00:30:50.067 07:36:22 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:50.067 07:36:22 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:30:50.067 07:36:22 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 
1794405 00:30:50.067 07:36:22 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1794405 ']' 00:30:50.067 07:36:22 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:50.067 07:36:22 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:50.067 07:36:22 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:50.067 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:50.067 07:36:22 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:50.067 07:36:22 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:50.067 [2024-07-25 07:36:22.378085] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:30:50.067 [2024-07-25 07:36:22.378154] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1794405 ] 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.0 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.1 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.2 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.3 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.4 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.5 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.6 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:01.7 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.0 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.1 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.2 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.3 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.4 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.5 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.6 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3d:02.7 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 
0000:3f:01.0 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:01.1 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:01.2 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:01.3 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:01.4 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:01.5 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:01.6 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:01.7 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.0 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.1 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.2 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.3 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.4 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.5 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.6 cannot be used 00:30:50.067 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:50.067 EAL: Requested device 0000:3f:02.7 cannot be used 00:30:50.067 [2024-07-25 07:36:22.509739] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:50.067 [2024-07-25 07:36:22.595019] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:30:50.067 [2024-07-25 07:36:22.595113] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:30:50.067 [2024-07-25 07:36:22.595117] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:30:51.001 [2024-07-25 07:36:23.271842] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:51.001 07:36:23 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:51.001 07:36:23 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:30:51.001 07:36:23 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:30:51.001 07:36:23 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:51.001 07:36:23 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:54.286 [2024-07-25 07:36:26.424720] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x189dba0 PMD being used: compress_qat 00:30:54.286 07:36:26 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:54.286 07:36:26 compress_compdev -- common/autotest_common.sh@899 -- 
# local bdev_name=Nvme0n1 00:30:54.286 07:36:26 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:54.286 07:36:26 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:54.286 07:36:26 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:54.286 07:36:26 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:54.286 07:36:26 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:54.286 07:36:26 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:54.545 [ 00:30:54.545 { 00:30:54.545 "name": "Nvme0n1", 00:30:54.545 "aliases": [ 00:30:54.545 "07594d26-f83d-4946-8779-ec68407be77b" 00:30:54.545 ], 00:30:54.545 "product_name": "NVMe disk", 00:30:54.545 "block_size": 512, 00:30:54.545 "num_blocks": 3907029168, 00:30:54.545 "uuid": "07594d26-f83d-4946-8779-ec68407be77b", 00:30:54.545 "assigned_rate_limits": { 00:30:54.545 "rw_ios_per_sec": 0, 00:30:54.545 "rw_mbytes_per_sec": 0, 00:30:54.545 "r_mbytes_per_sec": 0, 00:30:54.545 "w_mbytes_per_sec": 0 00:30:54.545 }, 00:30:54.545 "claimed": false, 00:30:54.545 "zoned": false, 00:30:54.545 "supported_io_types": { 00:30:54.545 "read": true, 00:30:54.545 "write": true, 00:30:54.545 "unmap": true, 00:30:54.545 "flush": true, 00:30:54.545 "reset": true, 00:30:54.545 "nvme_admin": true, 00:30:54.545 "nvme_io": true, 00:30:54.545 "nvme_io_md": false, 00:30:54.545 "write_zeroes": true, 00:30:54.545 "zcopy": false, 00:30:54.545 "get_zone_info": false, 00:30:54.545 "zone_management": false, 00:30:54.545 "zone_append": false, 00:30:54.545 "compare": false, 00:30:54.545 "compare_and_write": false, 00:30:54.545 "abort": true, 00:30:54.545 "seek_hole": false, 00:30:54.545 "seek_data": false, 00:30:54.545 "copy": false, 00:30:54.545 "nvme_iov_md": false 00:30:54.545 }, 00:30:54.545 "driver_specific": { 00:30:54.545 "nvme": [ 00:30:54.545 { 00:30:54.545 "pci_address": "0000:d8:00.0", 00:30:54.545 "trid": { 00:30:54.545 "trtype": "PCIe", 00:30:54.545 "traddr": "0000:d8:00.0" 00:30:54.545 }, 00:30:54.545 "ctrlr_data": { 00:30:54.545 "cntlid": 0, 00:30:54.545 "vendor_id": "0x8086", 00:30:54.545 "model_number": "INTEL SSDPE2KX020T8", 00:30:54.545 "serial_number": "BTLJ125505KA2P0BGN", 00:30:54.545 "firmware_revision": "VDV10170", 00:30:54.545 "oacs": { 00:30:54.545 "security": 0, 00:30:54.545 "format": 1, 00:30:54.545 "firmware": 1, 00:30:54.545 "ns_manage": 1 00:30:54.545 }, 00:30:54.545 "multi_ctrlr": false, 00:30:54.545 "ana_reporting": false 00:30:54.545 }, 00:30:54.545 "vs": { 00:30:54.545 "nvme_version": "1.2" 00:30:54.545 }, 00:30:54.545 "ns_data": { 00:30:54.545 "id": 1, 00:30:54.545 "can_share": false 00:30:54.545 } 00:30:54.545 } 00:30:54.545 ], 00:30:54.545 "mp_policy": "active_passive" 00:30:54.545 } 00:30:54.545 } 00:30:54.545 ] 00:30:54.545 07:36:26 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:54.545 07:36:26 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:54.803 [2024-07-25 07:36:27.122356] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x189db40 PMD being used: compress_qat 00:30:55.736 c55871c2-066e-43d6-9903-b2fad2369887 00:30:55.736 07:36:28 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:55.994 0261eafd-1815-4d2d-9c1b-f5d64adb6614 00:30:55.994 07:36:28 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:55.994 07:36:28 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:30:55.994 07:36:28 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:55.994 07:36:28 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:55.994 07:36:28 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:55.994 07:36:28 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:55.994 07:36:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:56.252 07:36:28 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:56.252 [ 00:30:56.252 { 00:30:56.252 "name": "0261eafd-1815-4d2d-9c1b-f5d64adb6614", 00:30:56.252 "aliases": [ 00:30:56.252 "lvs0/lv0" 00:30:56.252 ], 00:30:56.252 "product_name": "Logical Volume", 00:30:56.252 "block_size": 512, 00:30:56.252 "num_blocks": 204800, 00:30:56.252 "uuid": "0261eafd-1815-4d2d-9c1b-f5d64adb6614", 00:30:56.252 "assigned_rate_limits": { 00:30:56.252 "rw_ios_per_sec": 0, 00:30:56.252 "rw_mbytes_per_sec": 0, 00:30:56.252 "r_mbytes_per_sec": 0, 00:30:56.252 "w_mbytes_per_sec": 0 00:30:56.252 }, 00:30:56.252 "claimed": false, 00:30:56.252 "zoned": false, 00:30:56.252 "supported_io_types": { 00:30:56.252 "read": true, 00:30:56.252 "write": true, 00:30:56.252 "unmap": true, 00:30:56.252 "flush": false, 00:30:56.252 "reset": true, 00:30:56.252 "nvme_admin": false, 00:30:56.252 "nvme_io": false, 00:30:56.252 "nvme_io_md": false, 00:30:56.252 "write_zeroes": true, 00:30:56.252 "zcopy": false, 00:30:56.252 "get_zone_info": false, 00:30:56.252 "zone_management": false, 00:30:56.252 "zone_append": false, 00:30:56.252 "compare": false, 00:30:56.252 "compare_and_write": false, 00:30:56.252 "abort": false, 00:30:56.252 "seek_hole": true, 00:30:56.252 "seek_data": true, 00:30:56.252 "copy": false, 00:30:56.252 "nvme_iov_md": false 00:30:56.252 }, 00:30:56.252 "driver_specific": { 00:30:56.252 "lvol": { 00:30:56.252 "lvol_store_uuid": "c55871c2-066e-43d6-9903-b2fad2369887", 00:30:56.252 "base_bdev": "Nvme0n1", 00:30:56.252 "thin_provision": true, 00:30:56.252 "num_allocated_clusters": 0, 00:30:56.252 "snapshot": false, 00:30:56.252 "clone": false, 00:30:56.252 "esnap_clone": false 00:30:56.252 } 00:30:56.252 } 00:30:56.252 } 00:30:56.252 ] 00:30:56.510 07:36:28 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:56.510 07:36:28 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:56.510 07:36:28 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:56.511 [2024-07-25 07:36:29.011164] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:56.511 COMP_lvs0/lv0 00:30:56.511 07:36:29 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:56.511 07:36:29 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:30:56.511 07:36:29 compress_compdev -- common/autotest_common.sh@900 -- # local 
bdev_timeout= 00:30:56.511 07:36:29 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:30:56.511 07:36:29 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:56.511 07:36:29 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:56.511 07:36:29 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:56.768 07:36:29 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:57.026 [ 00:30:57.026 { 00:30:57.026 "name": "COMP_lvs0/lv0", 00:30:57.026 "aliases": [ 00:30:57.026 "8844f71b-85d2-54b7-8e78-6f19b3391062" 00:30:57.026 ], 00:30:57.026 "product_name": "compress", 00:30:57.026 "block_size": 512, 00:30:57.026 "num_blocks": 200704, 00:30:57.026 "uuid": "8844f71b-85d2-54b7-8e78-6f19b3391062", 00:30:57.026 "assigned_rate_limits": { 00:30:57.026 "rw_ios_per_sec": 0, 00:30:57.026 "rw_mbytes_per_sec": 0, 00:30:57.026 "r_mbytes_per_sec": 0, 00:30:57.026 "w_mbytes_per_sec": 0 00:30:57.026 }, 00:30:57.026 "claimed": false, 00:30:57.026 "zoned": false, 00:30:57.026 "supported_io_types": { 00:30:57.026 "read": true, 00:30:57.026 "write": true, 00:30:57.026 "unmap": false, 00:30:57.026 "flush": false, 00:30:57.026 "reset": false, 00:30:57.026 "nvme_admin": false, 00:30:57.026 "nvme_io": false, 00:30:57.026 "nvme_io_md": false, 00:30:57.026 "write_zeroes": true, 00:30:57.026 "zcopy": false, 00:30:57.026 "get_zone_info": false, 00:30:57.026 "zone_management": false, 00:30:57.026 "zone_append": false, 00:30:57.026 "compare": false, 00:30:57.026 "compare_and_write": false, 00:30:57.026 "abort": false, 00:30:57.026 "seek_hole": false, 00:30:57.026 "seek_data": false, 00:30:57.026 "copy": false, 00:30:57.026 "nvme_iov_md": false 00:30:57.026 }, 00:30:57.026 "driver_specific": { 00:30:57.026 "compress": { 00:30:57.026 "name": "COMP_lvs0/lv0", 00:30:57.026 "base_bdev_name": "0261eafd-1815-4d2d-9c1b-f5d64adb6614", 00:30:57.026 "pm_path": "/tmp/pmem/b5f892b6-50d4-4af4-baf9-8b655ae02df8" 00:30:57.026 } 00:30:57.026 } 00:30:57.026 } 00:30:57.026 ] 00:30:57.026 07:36:29 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:30:57.026 07:36:29 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:57.284 [2024-07-25 07:36:29.588061] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fefa41b1350 PMD being used: compress_qat 00:30:57.284 I/O targets: 00:30:57.284 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:57.284 00:30:57.284 00:30:57.284 CUnit - A unit testing framework for C - Version 2.1-3 00:30:57.284 http://cunit.sourceforge.net/ 00:30:57.284 00:30:57.284 00:30:57.284 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:57.284 Test: blockdev write read block ...passed 00:30:57.284 Test: blockdev write zeroes read block ...passed 00:30:57.284 Test: blockdev write zeroes read no split ...passed 00:30:57.284 Test: blockdev write zeroes read split ...passed 00:30:57.284 Test: blockdev write zeroes read split partial ...passed 00:30:57.284 Test: blockdev reset ...[2024-07-25 07:36:29.654570] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:57.284 passed 00:30:57.284 Test: blockdev write read 8 blocks ...passed 00:30:57.284 Test: blockdev write read size > 128k ...passed 00:30:57.284 Test: blockdev write read invalid 
size ...passed 00:30:57.284 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:57.284 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:57.284 Test: blockdev write read max offset ...passed 00:30:57.284 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:57.284 Test: blockdev writev readv 8 blocks ...passed 00:30:57.284 Test: blockdev writev readv 30 x 1block ...passed 00:30:57.284 Test: blockdev writev readv block ...passed 00:30:57.284 Test: blockdev writev readv size > 128k ...passed 00:30:57.284 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:57.284 Test: blockdev comparev and writev ...passed 00:30:57.284 Test: blockdev nvme passthru rw ...passed 00:30:57.284 Test: blockdev nvme passthru vendor specific ...passed 00:30:57.284 Test: blockdev nvme admin passthru ...passed 00:30:57.284 Test: blockdev copy ...passed 00:30:57.284 00:30:57.284 Run Summary: Type Total Ran Passed Failed Inactive 00:30:57.284 suites 1 1 n/a 0 0 00:30:57.284 tests 23 23 23 0 0 00:30:57.284 asserts 130 130 130 0 n/a 00:30:57.284 00:30:57.284 Elapsed time = 0.208 seconds 00:30:57.284 0 00:30:57.284 07:36:29 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:30:57.284 07:36:29 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:57.542 07:36:29 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:57.800 07:36:30 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:57.800 07:36:30 compress_compdev -- compress/compress.sh@62 -- # killprocess 1794405 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1794405 ']' 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1794405 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1794405 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1794405' 00:30:57.800 killing process with pid 1794405 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@969 -- # kill 1794405 00:30:57.800 07:36:30 compress_compdev -- common/autotest_common.sh@974 -- # wait 1794405 00:31:00.334 07:36:32 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:00.334 07:36:32 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:00.334 00:31:00.334 real 0m50.160s 00:31:00.334 user 1m53.269s 00:31:00.334 sys 0m5.553s 00:31:00.334 07:36:32 compress_compdev -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:00.334 07:36:32 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:00.334 ************************************ 00:31:00.334 END TEST compress_compdev 00:31:00.334 ************************************ 00:31:00.334 07:36:32 -- spdk/autotest.sh@353 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 
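Each sub-test above finishes with the same teardown before the next one starts: destroy_vols deletes the compress vbdev and then the lvol store over RPC, and killprocess stops the bdevperf/bdevio app. Below is a condensed sketch of that teardown, not the verbatim autotest_common.sh helper; $pid stands in for the PID printed in the log, and the branch taken when the process runs under sudo is omitted.

rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

# destroy_vols: unwind the stack top-first
$rpc_py bdev_compress_delete COMP_lvs0/lv0
$rpc_py bdev_lvol_delete_lvstore -l lvs0

# killprocess (condensed): only signal the app if the PID is still alive and
# is not a sudo wrapper, then reap it -- the same checks the trace walks through
if kill -0 "$pid" 2> /dev/null; then
    process_name=$(ps --no-headers -o comm= "$pid")
    if [ "$process_name" != sudo ]; then
        echo "killing process with pid $pid"
        kill "$pid"
    fi
    wait "$pid"
fi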
00:31:00.334 07:36:32 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:00.334 07:36:32 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:00.334 07:36:32 -- common/autotest_common.sh@10 -- # set +x 00:31:00.334 ************************************ 00:31:00.334 START TEST compress_isal 00:31:00.334 ************************************ 00:31:00.334 07:36:32 compress_isal -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:00.592 * Looking for test storage... 00:31:00.592 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:00.592 07:36:32 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:00.592 07:36:32 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:00.593 07:36:32 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:00.593 07:36:32 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:00.593 07:36:32 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:00.593 07:36:32 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.593 07:36:32 compress_isal -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.593 07:36:32 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.593 07:36:32 compress_isal -- paths/export.sh@5 -- # export PATH 00:31:00.593 07:36:32 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@47 -- # : 0 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:00.593 07:36:32 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1796193 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:00.593 07:36:32 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1796193 00:31:00.593 07:36:32 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1796193 ']' 00:31:00.593 07:36:32 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 
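Note the launch line in this compress_isal pass: bdevperf runs with the same queue depth, I/O size and runtime as the compdev passes, but without the -c dpdk.json argument, so the compress bdev is exercised on the ISA-L software path rather than the DPDK compressdev/QAT path. A condensed sketch of how run_bdevperf wires this up is below; waitforlisten and create_vols are the helpers traced in the log (autotest_common.sh and compress.sh), not redefined here.

bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf

# same -q/-o/-w/-t as the compdev passes, but no "-c dpdk.json";
# -z defers I/O until the perform_tests RPC issued further down
$bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
bdevperf_pid=$!

# wait for the app to answer on /var/tmp/spdk.sock (max_retries=100 in the
# trace), then build the same lvstore -> lvol -> compress stack; with no
# block-size argument bdev_compress_create is called without -l, so the
# compress bdev comes up with 512-byte blocks as in the bdevio dump above
waitforlisten "$bdevperf_pid"
create_vols

# kick off the timed run; this is what prints the Latency(us) table
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests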
00:31:00.593 07:36:32 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:00.593 07:36:32 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:00.593 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:00.593 07:36:32 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:00.593 07:36:32 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:00.593 [2024-07-25 07:36:33.003446] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:31:00.593 [2024-07-25 07:36:33.003507] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1796193 ] 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of 
QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:00.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:00.593 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:00.593 [2024-07-25 07:36:33.123959] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:00.851 [2024-07-25 07:36:33.207348] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:31:00.851 [2024-07-25 07:36:33.207355] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:31:01.415 07:36:33 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:01.415 07:36:33 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:31:01.415 07:36:33 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:01.415 07:36:33 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:01.415 07:36:33 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:04.696 07:36:37 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:04.696 07:36:37 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:04.696 07:36:37 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:04.696 07:36:37 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:04.696 07:36:37 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:04.696 07:36:37 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:04.696 07:36:37 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:04.955 07:36:37 compress_isal -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:04.955 [ 00:31:04.955 { 00:31:04.955 "name": "Nvme0n1", 00:31:04.955 "aliases": [ 00:31:04.955 "9850f6c6-e92c-4ca0-8bcb-9aa8717c0d8e" 00:31:04.955 ], 00:31:04.955 "product_name": "NVMe disk", 00:31:04.955 "block_size": 512, 00:31:04.955 "num_blocks": 3907029168, 00:31:04.955 "uuid": "9850f6c6-e92c-4ca0-8bcb-9aa8717c0d8e", 00:31:04.955 "assigned_rate_limits": { 00:31:04.955 "rw_ios_per_sec": 0, 00:31:04.955 "rw_mbytes_per_sec": 0, 00:31:04.955 "r_mbytes_per_sec": 0, 00:31:04.955 "w_mbytes_per_sec": 0 00:31:04.955 }, 00:31:04.955 "claimed": false, 00:31:04.955 "zoned": false, 00:31:04.955 "supported_io_types": { 00:31:04.955 "read": true, 00:31:04.955 "write": true, 00:31:04.955 "unmap": true, 00:31:04.955 "flush": true, 00:31:04.955 "reset": true, 00:31:04.955 "nvme_admin": true, 00:31:04.955 "nvme_io": true, 00:31:04.955 "nvme_io_md": false, 00:31:04.955 "write_zeroes": true, 00:31:04.955 "zcopy": false, 00:31:04.955 "get_zone_info": false, 00:31:04.955 "zone_management": false, 00:31:04.955 "zone_append": false, 00:31:04.955 "compare": false, 00:31:04.955 "compare_and_write": false, 00:31:04.955 "abort": true, 00:31:04.955 "seek_hole": false, 00:31:04.955 "seek_data": false, 00:31:04.955 "copy": false, 00:31:04.955 "nvme_iov_md": false 00:31:04.955 }, 00:31:04.955 "driver_specific": { 00:31:04.955 "nvme": [ 00:31:04.955 { 00:31:04.955 "pci_address": "0000:d8:00.0", 00:31:04.955 "trid": { 00:31:04.955 "trtype": "PCIe", 00:31:04.955 "traddr": "0000:d8:00.0" 00:31:04.955 }, 00:31:04.955 "ctrlr_data": { 00:31:04.955 "cntlid": 0, 00:31:04.955 "vendor_id": "0x8086", 00:31:04.955 "model_number": "INTEL SSDPE2KX020T8", 00:31:04.955 "serial_number": "BTLJ125505KA2P0BGN", 00:31:04.955 "firmware_revision": "VDV10170", 00:31:04.955 "oacs": { 00:31:04.955 "security": 0, 00:31:04.955 "format": 1, 00:31:04.955 "firmware": 1, 00:31:04.955 "ns_manage": 1 00:31:04.955 }, 00:31:04.955 "multi_ctrlr": false, 00:31:04.955 "ana_reporting": false 00:31:04.955 }, 00:31:04.955 "vs": { 00:31:04.955 "nvme_version": "1.2" 00:31:04.955 }, 00:31:04.955 "ns_data": { 00:31:04.955 "id": 1, 00:31:04.955 "can_share": false 00:31:04.955 } 00:31:04.955 } 00:31:04.955 ], 00:31:04.955 "mp_policy": "active_passive" 00:31:04.955 } 00:31:04.955 } 00:31:04.955 ] 00:31:05.215 07:36:37 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:05.215 07:36:37 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:06.593 850d193d-24ae-4633-b034-5bb206a87e6c 00:31:06.593 07:36:38 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:06.593 6eb5705d-d467-46ad-b2eb-e6f709e721a6 00:31:06.593 07:36:38 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:06.593 07:36:38 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:06.593 07:36:38 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:06.593 07:36:38 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:06.593 07:36:38 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:06.593 07:36:38 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:06.593 07:36:38 compress_isal -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:06.593 07:36:39 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:06.853 [ 00:31:06.853 { 00:31:06.853 "name": "6eb5705d-d467-46ad-b2eb-e6f709e721a6", 00:31:06.853 "aliases": [ 00:31:06.853 "lvs0/lv0" 00:31:06.853 ], 00:31:06.853 "product_name": "Logical Volume", 00:31:06.853 "block_size": 512, 00:31:06.853 "num_blocks": 204800, 00:31:06.853 "uuid": "6eb5705d-d467-46ad-b2eb-e6f709e721a6", 00:31:06.853 "assigned_rate_limits": { 00:31:06.853 "rw_ios_per_sec": 0, 00:31:06.853 "rw_mbytes_per_sec": 0, 00:31:06.853 "r_mbytes_per_sec": 0, 00:31:06.853 "w_mbytes_per_sec": 0 00:31:06.853 }, 00:31:06.853 "claimed": false, 00:31:06.853 "zoned": false, 00:31:06.853 "supported_io_types": { 00:31:06.853 "read": true, 00:31:06.853 "write": true, 00:31:06.853 "unmap": true, 00:31:06.853 "flush": false, 00:31:06.853 "reset": true, 00:31:06.853 "nvme_admin": false, 00:31:06.853 "nvme_io": false, 00:31:06.853 "nvme_io_md": false, 00:31:06.853 "write_zeroes": true, 00:31:06.853 "zcopy": false, 00:31:06.853 "get_zone_info": false, 00:31:06.853 "zone_management": false, 00:31:06.853 "zone_append": false, 00:31:06.853 "compare": false, 00:31:06.853 "compare_and_write": false, 00:31:06.853 "abort": false, 00:31:06.853 "seek_hole": true, 00:31:06.853 "seek_data": true, 00:31:06.853 "copy": false, 00:31:06.853 "nvme_iov_md": false 00:31:06.853 }, 00:31:06.853 "driver_specific": { 00:31:06.853 "lvol": { 00:31:06.853 "lvol_store_uuid": "850d193d-24ae-4633-b034-5bb206a87e6c", 00:31:06.853 "base_bdev": "Nvme0n1", 00:31:06.853 "thin_provision": true, 00:31:06.853 "num_allocated_clusters": 0, 00:31:06.853 "snapshot": false, 00:31:06.853 "clone": false, 00:31:06.853 "esnap_clone": false 00:31:06.853 } 00:31:06.853 } 00:31:06.853 } 00:31:06.853 ] 00:31:06.853 07:36:39 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:06.853 07:36:39 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:06.853 07:36:39 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:07.112 [2024-07-25 07:36:39.552064] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:07.112 COMP_lvs0/lv0 00:31:07.112 07:36:39 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:07.112 07:36:39 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:07.112 07:36:39 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:07.112 07:36:39 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:07.112 07:36:39 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:07.112 07:36:39 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:07.112 07:36:39 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:07.373 07:36:39 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:07.662 [ 00:31:07.662 { 00:31:07.662 "name": "COMP_lvs0/lv0", 00:31:07.662 "aliases": [ 00:31:07.662 "e714528b-ab80-5810-afa2-d72a5688080e" 00:31:07.662 ], 00:31:07.662 "product_name": "compress", 
00:31:07.662 "block_size": 512, 00:31:07.662 "num_blocks": 200704, 00:31:07.662 "uuid": "e714528b-ab80-5810-afa2-d72a5688080e", 00:31:07.662 "assigned_rate_limits": { 00:31:07.662 "rw_ios_per_sec": 0, 00:31:07.662 "rw_mbytes_per_sec": 0, 00:31:07.662 "r_mbytes_per_sec": 0, 00:31:07.662 "w_mbytes_per_sec": 0 00:31:07.662 }, 00:31:07.662 "claimed": false, 00:31:07.662 "zoned": false, 00:31:07.662 "supported_io_types": { 00:31:07.662 "read": true, 00:31:07.662 "write": true, 00:31:07.662 "unmap": false, 00:31:07.662 "flush": false, 00:31:07.662 "reset": false, 00:31:07.662 "nvme_admin": false, 00:31:07.662 "nvme_io": false, 00:31:07.662 "nvme_io_md": false, 00:31:07.662 "write_zeroes": true, 00:31:07.662 "zcopy": false, 00:31:07.662 "get_zone_info": false, 00:31:07.662 "zone_management": false, 00:31:07.662 "zone_append": false, 00:31:07.662 "compare": false, 00:31:07.662 "compare_and_write": false, 00:31:07.662 "abort": false, 00:31:07.662 "seek_hole": false, 00:31:07.662 "seek_data": false, 00:31:07.662 "copy": false, 00:31:07.662 "nvme_iov_md": false 00:31:07.662 }, 00:31:07.662 "driver_specific": { 00:31:07.662 "compress": { 00:31:07.662 "name": "COMP_lvs0/lv0", 00:31:07.662 "base_bdev_name": "6eb5705d-d467-46ad-b2eb-e6f709e721a6", 00:31:07.662 "pm_path": "/tmp/pmem/8e62ef84-7498-415e-b743-5585ec9a2e9e" 00:31:07.662 } 00:31:07.662 } 00:31:07.662 } 00:31:07.662 ] 00:31:07.662 07:36:40 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:07.662 07:36:40 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:07.662 Running I/O for 3 seconds... 00:31:10.952 00:31:10.952 Latency(us) 00:31:10.952 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:10.952 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:10.952 Verification LBA range: start 0x0 length 0x3100 00:31:10.952 COMP_lvs0/lv0 : 3.01 3429.55 13.40 0.00 0.00 9274.75 58.57 17196.65 00:31:10.952 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:10.952 Verification LBA range: start 0x3100 length 0x3100 00:31:10.952 COMP_lvs0/lv0 : 3.01 3442.63 13.45 0.00 0.00 9247.98 56.93 16567.50 00:31:10.952 =================================================================================================================== 00:31:10.952 Total : 6872.18 26.84 0.00 0.00 9261.34 56.93 17196.65 00:31:10.952 0 00:31:10.952 07:36:43 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:10.952 07:36:43 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:10.952 07:36:43 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:11.211 07:36:43 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:11.211 07:36:43 compress_isal -- compress/compress.sh@78 -- # killprocess 1796193 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1796193 ']' 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1796193 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@955 -- # uname 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1796193 00:31:11.211 
07:36:43 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1796193' 00:31:11.211 killing process with pid 1796193 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@969 -- # kill 1796193 00:31:11.211 Received shutdown signal, test time was about 3.000000 seconds 00:31:11.211 00:31:11.211 Latency(us) 00:31:11.211 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:11.211 =================================================================================================================== 00:31:11.211 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:11.211 07:36:43 compress_isal -- common/autotest_common.sh@974 -- # wait 1796193 00:31:13.746 07:36:46 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:13.746 07:36:46 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:13.746 07:36:46 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1798325 00:31:13.746 07:36:46 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:13.746 07:36:46 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:13.746 07:36:46 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1798325 00:31:13.746 07:36:46 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1798325 ']' 00:31:13.746 07:36:46 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:13.746 07:36:46 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:13.746 07:36:46 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:13.746 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:13.746 07:36:46 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:13.746 07:36:46 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:13.746 [2024-07-25 07:36:46.203977] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
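The second pass (run_bdevperf 32 4096 3 512) launches bdevperf with the exact flags shown above (-z -q 32 -o 4096 -w verify -t 3 -C -m 0x6) and then blocks in waitforlisten until the app's RPC socket at /var/tmp/spdk.sock answers. The sketch below approximates that wait by polling rpc_get_methods; only the bdevperf command line, the socket path and rpc.py come from the log, the polling loop and variable names are the editor's.

#!/usr/bin/env bash
# Approximation of the launch-and-wait step traced above (sketch, not the suite's waitforlisten).
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}   # assumption
RPC_PY="$SPDK_DIR/scripts/rpc.py"
RPC_SOCK=/var/tmp/spdk.sock

# Start bdevperf in -z (wait for RPC configuration) mode with the queue depth, I/O size,
# workload, runtime and core mask used by this pass.
"$SPDK_DIR/build/examples/bdevperf" -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
bdevperf_pid=$!

# Poll the RPC socket until the app responds; the suite's waitforlisten is more thorough.
for _ in $(seq 1 100); do
    if "$RPC_PY" -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; then
        echo "bdevperf (pid $bdevperf_pid) is listening on $RPC_SOCK"
        break
    fi
    sleep 0.5
done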
00:31:13.746 [2024-07-25 07:36:46.204041] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1798325 ] 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:13.746 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:13.746 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:13.746 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:14.006 [2024-07-25 07:36:46.323953] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:14.006 [2024-07-25 07:36:46.410970] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:31:14.006 [2024-07-25 07:36:46.410977] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:31:14.574 07:36:47 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:14.574 07:36:47 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:31:14.833 07:36:47 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:31:14.833 07:36:47 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:14.833 07:36:47 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:18.123 07:36:50 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:18.123 07:36:50 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:18.123 07:36:50 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:18.123 07:36:50 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:18.123 07:36:50 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:18.123 07:36:50 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:18.123 07:36:50 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:18.123 07:36:50 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:18.123 [ 00:31:18.123 { 00:31:18.123 "name": "Nvme0n1", 00:31:18.123 "aliases": [ 00:31:18.123 "d704b86c-2b34-48a7-bed5-9c997aa1cd83" 00:31:18.123 ], 00:31:18.123 "product_name": "NVMe disk", 00:31:18.123 "block_size": 512, 00:31:18.123 "num_blocks": 3907029168, 00:31:18.123 "uuid": "d704b86c-2b34-48a7-bed5-9c997aa1cd83", 00:31:18.123 "assigned_rate_limits": { 00:31:18.123 "rw_ios_per_sec": 0, 00:31:18.123 "rw_mbytes_per_sec": 0, 00:31:18.123 "r_mbytes_per_sec": 0, 00:31:18.123 "w_mbytes_per_sec": 0 00:31:18.123 }, 00:31:18.123 "claimed": false, 
00:31:18.123 "zoned": false, 00:31:18.124 "supported_io_types": { 00:31:18.124 "read": true, 00:31:18.124 "write": true, 00:31:18.124 "unmap": true, 00:31:18.124 "flush": true, 00:31:18.124 "reset": true, 00:31:18.124 "nvme_admin": true, 00:31:18.124 "nvme_io": true, 00:31:18.124 "nvme_io_md": false, 00:31:18.124 "write_zeroes": true, 00:31:18.124 "zcopy": false, 00:31:18.124 "get_zone_info": false, 00:31:18.124 "zone_management": false, 00:31:18.124 "zone_append": false, 00:31:18.124 "compare": false, 00:31:18.124 "compare_and_write": false, 00:31:18.124 "abort": true, 00:31:18.124 "seek_hole": false, 00:31:18.124 "seek_data": false, 00:31:18.124 "copy": false, 00:31:18.124 "nvme_iov_md": false 00:31:18.124 }, 00:31:18.124 "driver_specific": { 00:31:18.124 "nvme": [ 00:31:18.124 { 00:31:18.124 "pci_address": "0000:d8:00.0", 00:31:18.124 "trid": { 00:31:18.124 "trtype": "PCIe", 00:31:18.124 "traddr": "0000:d8:00.0" 00:31:18.124 }, 00:31:18.124 "ctrlr_data": { 00:31:18.124 "cntlid": 0, 00:31:18.124 "vendor_id": "0x8086", 00:31:18.124 "model_number": "INTEL SSDPE2KX020T8", 00:31:18.124 "serial_number": "BTLJ125505KA2P0BGN", 00:31:18.124 "firmware_revision": "VDV10170", 00:31:18.124 "oacs": { 00:31:18.124 "security": 0, 00:31:18.124 "format": 1, 00:31:18.124 "firmware": 1, 00:31:18.124 "ns_manage": 1 00:31:18.124 }, 00:31:18.124 "multi_ctrlr": false, 00:31:18.124 "ana_reporting": false 00:31:18.124 }, 00:31:18.124 "vs": { 00:31:18.124 "nvme_version": "1.2" 00:31:18.124 }, 00:31:18.124 "ns_data": { 00:31:18.124 "id": 1, 00:31:18.124 "can_share": false 00:31:18.124 } 00:31:18.124 } 00:31:18.124 ], 00:31:18.124 "mp_policy": "active_passive" 00:31:18.124 } 00:31:18.124 } 00:31:18.124 ] 00:31:18.124 07:36:50 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:18.124 07:36:50 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:19.503 d1b3b20f-82f8-446c-b662-ea298efb2eb7 00:31:19.503 07:36:51 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:19.762 cfda57d0-9301-465c-9023-d84781880831 00:31:19.762 07:36:52 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:19.762 07:36:52 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:19.762 07:36:52 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:19.762 07:36:52 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:19.762 07:36:52 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:19.762 07:36:52 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:19.762 07:36:52 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:20.021 07:36:52 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:20.021 [ 00:31:20.021 { 00:31:20.021 "name": "cfda57d0-9301-465c-9023-d84781880831", 00:31:20.021 "aliases": [ 00:31:20.021 "lvs0/lv0" 00:31:20.021 ], 00:31:20.021 "product_name": "Logical Volume", 00:31:20.021 "block_size": 512, 00:31:20.021 "num_blocks": 204800, 00:31:20.021 "uuid": "cfda57d0-9301-465c-9023-d84781880831", 00:31:20.021 "assigned_rate_limits": { 00:31:20.021 "rw_ios_per_sec": 0, 00:31:20.021 
"rw_mbytes_per_sec": 0, 00:31:20.021 "r_mbytes_per_sec": 0, 00:31:20.021 "w_mbytes_per_sec": 0 00:31:20.021 }, 00:31:20.021 "claimed": false, 00:31:20.021 "zoned": false, 00:31:20.021 "supported_io_types": { 00:31:20.021 "read": true, 00:31:20.021 "write": true, 00:31:20.021 "unmap": true, 00:31:20.021 "flush": false, 00:31:20.021 "reset": true, 00:31:20.021 "nvme_admin": false, 00:31:20.021 "nvme_io": false, 00:31:20.021 "nvme_io_md": false, 00:31:20.021 "write_zeroes": true, 00:31:20.021 "zcopy": false, 00:31:20.021 "get_zone_info": false, 00:31:20.021 "zone_management": false, 00:31:20.021 "zone_append": false, 00:31:20.021 "compare": false, 00:31:20.021 "compare_and_write": false, 00:31:20.021 "abort": false, 00:31:20.021 "seek_hole": true, 00:31:20.021 "seek_data": true, 00:31:20.021 "copy": false, 00:31:20.021 "nvme_iov_md": false 00:31:20.021 }, 00:31:20.021 "driver_specific": { 00:31:20.021 "lvol": { 00:31:20.021 "lvol_store_uuid": "d1b3b20f-82f8-446c-b662-ea298efb2eb7", 00:31:20.021 "base_bdev": "Nvme0n1", 00:31:20.021 "thin_provision": true, 00:31:20.021 "num_allocated_clusters": 0, 00:31:20.021 "snapshot": false, 00:31:20.021 "clone": false, 00:31:20.021 "esnap_clone": false 00:31:20.021 } 00:31:20.021 } 00:31:20.021 } 00:31:20.021 ] 00:31:20.021 07:36:52 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:20.021 07:36:52 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:20.021 07:36:52 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:20.281 [2024-07-25 07:36:52.741703] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:20.281 COMP_lvs0/lv0 00:31:20.281 07:36:52 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:20.281 07:36:52 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:20.281 07:36:52 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:20.281 07:36:52 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:20.281 07:36:52 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:20.281 07:36:52 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:20.281 07:36:52 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:20.540 07:36:52 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:20.799 [ 00:31:20.799 { 00:31:20.799 "name": "COMP_lvs0/lv0", 00:31:20.799 "aliases": [ 00:31:20.799 "7bb93b7b-1c87-5148-be24-74bb327963c1" 00:31:20.799 ], 00:31:20.799 "product_name": "compress", 00:31:20.799 "block_size": 512, 00:31:20.799 "num_blocks": 200704, 00:31:20.799 "uuid": "7bb93b7b-1c87-5148-be24-74bb327963c1", 00:31:20.799 "assigned_rate_limits": { 00:31:20.799 "rw_ios_per_sec": 0, 00:31:20.799 "rw_mbytes_per_sec": 0, 00:31:20.799 "r_mbytes_per_sec": 0, 00:31:20.799 "w_mbytes_per_sec": 0 00:31:20.799 }, 00:31:20.799 "claimed": false, 00:31:20.799 "zoned": false, 00:31:20.799 "supported_io_types": { 00:31:20.799 "read": true, 00:31:20.799 "write": true, 00:31:20.799 "unmap": false, 00:31:20.799 "flush": false, 00:31:20.799 "reset": false, 00:31:20.799 "nvme_admin": false, 00:31:20.799 "nvme_io": false, 00:31:20.799 "nvme_io_md": false, 
00:31:20.799 "write_zeroes": true, 00:31:20.799 "zcopy": false, 00:31:20.799 "get_zone_info": false, 00:31:20.799 "zone_management": false, 00:31:20.799 "zone_append": false, 00:31:20.799 "compare": false, 00:31:20.799 "compare_and_write": false, 00:31:20.799 "abort": false, 00:31:20.799 "seek_hole": false, 00:31:20.799 "seek_data": false, 00:31:20.799 "copy": false, 00:31:20.799 "nvme_iov_md": false 00:31:20.799 }, 00:31:20.799 "driver_specific": { 00:31:20.799 "compress": { 00:31:20.799 "name": "COMP_lvs0/lv0", 00:31:20.799 "base_bdev_name": "cfda57d0-9301-465c-9023-d84781880831", 00:31:20.799 "pm_path": "/tmp/pmem/3eeda4e0-b05a-4cc7-8351-2ea164438568" 00:31:20.799 } 00:31:20.799 } 00:31:20.799 } 00:31:20.799 ] 00:31:20.799 07:36:53 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:20.799 07:36:53 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:20.799 Running I/O for 3 seconds... 00:31:24.081 00:31:24.081 Latency(us) 00:31:24.081 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:24.081 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:24.081 Verification LBA range: start 0x0 length 0x3100 00:31:24.081 COMP_lvs0/lv0 : 3.01 3485.98 13.62 0.00 0.00 9119.43 57.34 17091.79 00:31:24.081 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:24.081 Verification LBA range: start 0x3100 length 0x3100 00:31:24.081 COMP_lvs0/lv0 : 3.01 3472.28 13.56 0.00 0.00 9169.13 57.34 16462.64 00:31:24.081 =================================================================================================================== 00:31:24.081 Total : 6958.26 27.18 0.00 0.00 9144.23 57.34 17091.79 00:31:24.081 0 00:31:24.081 07:36:56 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:24.081 07:36:56 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:24.081 07:36:56 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:24.339 07:36:56 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:24.339 07:36:56 compress_isal -- compress/compress.sh@78 -- # killprocess 1798325 00:31:24.339 07:36:56 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1798325 ']' 00:31:24.339 07:36:56 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1798325 00:31:24.339 07:36:56 compress_isal -- common/autotest_common.sh@955 -- # uname 00:31:24.339 07:36:56 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:24.339 07:36:56 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1798325 00:31:24.597 07:36:56 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:31:24.597 07:36:56 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:24.597 07:36:56 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1798325' 00:31:24.597 killing process with pid 1798325 00:31:24.597 07:36:56 compress_isal -- common/autotest_common.sh@969 -- # kill 1798325 00:31:24.597 Received shutdown signal, test time was about 3.000000 seconds 00:31:24.597 00:31:24.597 Latency(us) 00:31:24.597 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:24.597 
=================================================================================================================== 00:31:24.597 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:24.597 07:36:56 compress_isal -- common/autotest_common.sh@974 -- # wait 1798325 00:31:27.130 07:36:59 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:27.130 07:36:59 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:27.130 07:36:59 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1800526 00:31:27.130 07:36:59 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:27.130 07:36:59 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:27.130 07:36:59 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1800526 00:31:27.130 07:36:59 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1800526 ']' 00:31:27.130 07:36:59 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:27.130 07:36:59 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:27.130 07:36:59 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:27.130 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:27.130 07:36:59 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:27.130 07:36:59 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:27.130 [2024-07-25 07:36:59.379697] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
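Before the I/O phase of this third pass (run_bdevperf 32 4096 3 4096), the entries that follow replay the same create_vols RPC chain seen earlier in the log, this time passing -l 4096 to bdev_compress_create. A condensed sketch is given below; every command appears verbatim in the surrounding trace, and only the variables and comments are editorial.

#!/usr/bin/env bash
# Sketch of the create_vols sequence for a 4096-byte logical block size (mirrors the RPCs in this log).
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}   # assumption
RPC_PY="$SPDK_DIR/scripts/rpc.py"
LB_SIZE=4096

# Attach the NVMe controller: gen_nvme.sh emits a subsystem config that load_subsystem_config applies.
"$SPDK_DIR/scripts/gen_nvme.sh" | "$RPC_PY" load_subsystem_config

# Build the stack: lvolstore on Nvme0n1, a 100 MiB thin-provisioned lvol (204800 x 512-byte blocks,
# matching the dumps above), then the compress vbdev backed by /tmp/pmem with the requested -l size.
"$RPC_PY" bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
"$RPC_PY" bdev_lvol_create -t -l lvs0 lv0 100
"$RPC_PY" bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l "$LB_SIZE"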
00:31:27.130 [2024-07-25 07:36:59.379760] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1800526 ] 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.130 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:27.130 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:27.131 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:27.131 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:27.131 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:27.131 [2024-07-25 07:36:59.501812] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:27.131 [2024-07-25 07:36:59.588759] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:31:27.131 [2024-07-25 07:36:59.588765] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:31:28.065 07:37:00 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:28.065 07:37:00 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:31:28.065 07:37:00 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:31:28.065 07:37:00 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:28.065 07:37:00 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:31.354 07:37:03 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:31.354 [ 00:31:31.354 { 00:31:31.354 "name": "Nvme0n1", 00:31:31.354 "aliases": [ 00:31:31.354 "3d78a5aa-128e-463b-b52f-026a85827252" 00:31:31.354 ], 00:31:31.354 "product_name": "NVMe disk", 00:31:31.354 "block_size": 512, 00:31:31.354 "num_blocks": 3907029168, 00:31:31.354 "uuid": "3d78a5aa-128e-463b-b52f-026a85827252", 00:31:31.354 "assigned_rate_limits": { 00:31:31.354 "rw_ios_per_sec": 0, 00:31:31.354 "rw_mbytes_per_sec": 0, 00:31:31.354 "r_mbytes_per_sec": 0, 00:31:31.354 "w_mbytes_per_sec": 0 00:31:31.354 }, 00:31:31.354 "claimed": false, 
00:31:31.354 "zoned": false, 00:31:31.354 "supported_io_types": { 00:31:31.354 "read": true, 00:31:31.354 "write": true, 00:31:31.354 "unmap": true, 00:31:31.354 "flush": true, 00:31:31.354 "reset": true, 00:31:31.354 "nvme_admin": true, 00:31:31.354 "nvme_io": true, 00:31:31.354 "nvme_io_md": false, 00:31:31.354 "write_zeroes": true, 00:31:31.354 "zcopy": false, 00:31:31.354 "get_zone_info": false, 00:31:31.354 "zone_management": false, 00:31:31.354 "zone_append": false, 00:31:31.354 "compare": false, 00:31:31.354 "compare_and_write": false, 00:31:31.354 "abort": true, 00:31:31.354 "seek_hole": false, 00:31:31.354 "seek_data": false, 00:31:31.354 "copy": false, 00:31:31.354 "nvme_iov_md": false 00:31:31.354 }, 00:31:31.354 "driver_specific": { 00:31:31.354 "nvme": [ 00:31:31.354 { 00:31:31.354 "pci_address": "0000:d8:00.0", 00:31:31.354 "trid": { 00:31:31.354 "trtype": "PCIe", 00:31:31.354 "traddr": "0000:d8:00.0" 00:31:31.354 }, 00:31:31.354 "ctrlr_data": { 00:31:31.354 "cntlid": 0, 00:31:31.354 "vendor_id": "0x8086", 00:31:31.354 "model_number": "INTEL SSDPE2KX020T8", 00:31:31.354 "serial_number": "BTLJ125505KA2P0BGN", 00:31:31.354 "firmware_revision": "VDV10170", 00:31:31.354 "oacs": { 00:31:31.354 "security": 0, 00:31:31.354 "format": 1, 00:31:31.354 "firmware": 1, 00:31:31.354 "ns_manage": 1 00:31:31.354 }, 00:31:31.354 "multi_ctrlr": false, 00:31:31.354 "ana_reporting": false 00:31:31.354 }, 00:31:31.354 "vs": { 00:31:31.354 "nvme_version": "1.2" 00:31:31.354 }, 00:31:31.354 "ns_data": { 00:31:31.354 "id": 1, 00:31:31.354 "can_share": false 00:31:31.354 } 00:31:31.354 } 00:31:31.354 ], 00:31:31.354 "mp_policy": "active_passive" 00:31:31.354 } 00:31:31.354 } 00:31:31.354 ] 00:31:31.354 07:37:03 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:31.354 07:37:03 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:32.739 bd9e13ce-d2ac-48ae-a860-ed269c8095fe 00:31:32.739 07:37:04 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:32.739 e9a9a194-13e0-4386-b679-7fbab5a2def2 00:31:32.739 07:37:05 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:32.739 07:37:05 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:32.739 07:37:05 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:32.739 07:37:05 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:32.739 07:37:05 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:32.739 07:37:05 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:32.739 07:37:05 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:32.739 07:37:05 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:32.999 [ 00:31:32.999 { 00:31:32.999 "name": "e9a9a194-13e0-4386-b679-7fbab5a2def2", 00:31:32.999 "aliases": [ 00:31:32.999 "lvs0/lv0" 00:31:32.999 ], 00:31:32.999 "product_name": "Logical Volume", 00:31:32.999 "block_size": 512, 00:31:32.999 "num_blocks": 204800, 00:31:32.999 "uuid": "e9a9a194-13e0-4386-b679-7fbab5a2def2", 00:31:32.999 "assigned_rate_limits": { 00:31:32.999 "rw_ios_per_sec": 0, 00:31:32.999 
"rw_mbytes_per_sec": 0, 00:31:32.999 "r_mbytes_per_sec": 0, 00:31:32.999 "w_mbytes_per_sec": 0 00:31:32.999 }, 00:31:32.999 "claimed": false, 00:31:32.999 "zoned": false, 00:31:32.999 "supported_io_types": { 00:31:32.999 "read": true, 00:31:32.999 "write": true, 00:31:32.999 "unmap": true, 00:31:32.999 "flush": false, 00:31:32.999 "reset": true, 00:31:32.999 "nvme_admin": false, 00:31:32.999 "nvme_io": false, 00:31:32.999 "nvme_io_md": false, 00:31:32.999 "write_zeroes": true, 00:31:32.999 "zcopy": false, 00:31:32.999 "get_zone_info": false, 00:31:32.999 "zone_management": false, 00:31:32.999 "zone_append": false, 00:31:32.999 "compare": false, 00:31:32.999 "compare_and_write": false, 00:31:32.999 "abort": false, 00:31:32.999 "seek_hole": true, 00:31:32.999 "seek_data": true, 00:31:32.999 "copy": false, 00:31:32.999 "nvme_iov_md": false 00:31:32.999 }, 00:31:32.999 "driver_specific": { 00:31:32.999 "lvol": { 00:31:32.999 "lvol_store_uuid": "bd9e13ce-d2ac-48ae-a860-ed269c8095fe", 00:31:32.999 "base_bdev": "Nvme0n1", 00:31:32.999 "thin_provision": true, 00:31:32.999 "num_allocated_clusters": 0, 00:31:32.999 "snapshot": false, 00:31:32.999 "clone": false, 00:31:32.999 "esnap_clone": false 00:31:32.999 } 00:31:32.999 } 00:31:32.999 } 00:31:32.999 ] 00:31:32.999 07:37:05 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:32.999 07:37:05 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:32.999 07:37:05 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:32.999 [2024-07-25 07:37:05.505027] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:32.999 COMP_lvs0/lv0 00:31:32.999 07:37:05 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:32.999 07:37:05 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:32.999 07:37:05 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:32.999 07:37:05 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:32.999 07:37:05 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:32.999 07:37:05 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:32.999 07:37:05 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:33.258 07:37:05 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:33.518 [ 00:31:33.518 { 00:31:33.518 "name": "COMP_lvs0/lv0", 00:31:33.518 "aliases": [ 00:31:33.518 "4086ae2c-c62b-5f22-bdbf-d375b6e888cc" 00:31:33.518 ], 00:31:33.518 "product_name": "compress", 00:31:33.518 "block_size": 4096, 00:31:33.518 "num_blocks": 25088, 00:31:33.518 "uuid": "4086ae2c-c62b-5f22-bdbf-d375b6e888cc", 00:31:33.518 "assigned_rate_limits": { 00:31:33.518 "rw_ios_per_sec": 0, 00:31:33.518 "rw_mbytes_per_sec": 0, 00:31:33.518 "r_mbytes_per_sec": 0, 00:31:33.518 "w_mbytes_per_sec": 0 00:31:33.518 }, 00:31:33.518 "claimed": false, 00:31:33.518 "zoned": false, 00:31:33.518 "supported_io_types": { 00:31:33.518 "read": true, 00:31:33.518 "write": true, 00:31:33.518 "unmap": false, 00:31:33.518 "flush": false, 00:31:33.518 "reset": false, 00:31:33.518 "nvme_admin": false, 00:31:33.518 "nvme_io": false, 00:31:33.518 "nvme_io_md": false, 
00:31:33.518 "write_zeroes": true, 00:31:33.518 "zcopy": false, 00:31:33.518 "get_zone_info": false, 00:31:33.518 "zone_management": false, 00:31:33.518 "zone_append": false, 00:31:33.518 "compare": false, 00:31:33.518 "compare_and_write": false, 00:31:33.518 "abort": false, 00:31:33.518 "seek_hole": false, 00:31:33.518 "seek_data": false, 00:31:33.518 "copy": false, 00:31:33.518 "nvme_iov_md": false 00:31:33.518 }, 00:31:33.518 "driver_specific": { 00:31:33.518 "compress": { 00:31:33.518 "name": "COMP_lvs0/lv0", 00:31:33.518 "base_bdev_name": "e9a9a194-13e0-4386-b679-7fbab5a2def2", 00:31:33.518 "pm_path": "/tmp/pmem/45fa004a-7907-43a9-9858-b8a62bc1e14f" 00:31:33.518 } 00:31:33.518 } 00:31:33.518 } 00:31:33.518 ] 00:31:33.518 07:37:05 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:33.518 07:37:05 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:33.518 Running I/O for 3 seconds... 00:31:36.808 00:31:36.808 Latency(us) 00:31:36.808 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:36.808 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:36.808 Verification LBA range: start 0x0 length 0x3100 00:31:36.808 COMP_lvs0/lv0 : 3.01 3402.59 13.29 0.00 0.00 9350.96 59.39 15833.50 00:31:36.808 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:36.808 Verification LBA range: start 0x3100 length 0x3100 00:31:36.808 COMP_lvs0/lv0 : 3.00 3422.91 13.37 0.00 0.00 9307.26 55.71 16252.93 00:31:36.808 =================================================================================================================== 00:31:36.808 Total : 6825.50 26.66 0.00 0.00 9329.05 55.71 16252.93 00:31:36.808 0 00:31:36.808 07:37:08 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:36.808 07:37:08 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:36.808 07:37:09 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:37.068 07:37:09 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:37.068 07:37:09 compress_isal -- compress/compress.sh@78 -- # killprocess 1800526 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1800526 ']' 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1800526 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@955 -- # uname 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1800526 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1800526' 00:31:37.068 killing process with pid 1800526 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@969 -- # kill 1800526 00:31:37.068 Received shutdown signal, test time was about 3.000000 seconds 00:31:37.068 00:31:37.068 Latency(us) 00:31:37.068 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:37.068 
=================================================================================================================== 00:31:37.068 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:37.068 07:37:09 compress_isal -- common/autotest_common.sh@974 -- # wait 1800526 00:31:39.606 07:37:11 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:39.606 07:37:11 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:39.606 07:37:11 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1802596 00:31:39.606 07:37:11 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:39.606 07:37:11 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:39.606 07:37:11 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1802596 00:31:39.606 07:37:11 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1802596 ']' 00:31:39.606 07:37:11 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:39.606 07:37:11 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:39.606 07:37:11 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:39.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:39.606 07:37:11 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:39.606 07:37:11 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:39.606 [2024-07-25 07:37:11.834809] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:31:39.606 [2024-07-25 07:37:11.834857] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1802596 ] 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: 
Requested device 0000:3d:02.2 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:39.606 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:39.606 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:39.606 [2024-07-25 07:37:11.950404] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:39.606 [2024-07-25 07:37:12.041533] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:31:39.606 [2024-07-25 07:37:12.041627] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:31:39.606 [2024-07-25 07:37:12.041632] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:31:40.544 07:37:12 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:40.544 07:37:12 compress_isal -- 
common/autotest_common.sh@864 -- # return 0 00:31:40.544 07:37:12 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:40.544 07:37:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:40.544 07:37:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:43.868 07:37:15 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:43.868 07:37:15 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:43.868 07:37:15 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:43.868 07:37:15 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:43.868 07:37:15 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:43.868 07:37:15 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:43.868 07:37:15 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:43.868 07:37:16 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:43.868 [ 00:31:43.868 { 00:31:43.868 "name": "Nvme0n1", 00:31:43.868 "aliases": [ 00:31:43.868 "e95a684b-2f07-44bf-935c-89e1b49e13a4" 00:31:43.868 ], 00:31:43.868 "product_name": "NVMe disk", 00:31:43.868 "block_size": 512, 00:31:43.868 "num_blocks": 3907029168, 00:31:43.868 "uuid": "e95a684b-2f07-44bf-935c-89e1b49e13a4", 00:31:43.868 "assigned_rate_limits": { 00:31:43.868 "rw_ios_per_sec": 0, 00:31:43.868 "rw_mbytes_per_sec": 0, 00:31:43.868 "r_mbytes_per_sec": 0, 00:31:43.868 "w_mbytes_per_sec": 0 00:31:43.868 }, 00:31:43.868 "claimed": false, 00:31:43.868 "zoned": false, 00:31:43.868 "supported_io_types": { 00:31:43.868 "read": true, 00:31:43.868 "write": true, 00:31:43.868 "unmap": true, 00:31:43.868 "flush": true, 00:31:43.868 "reset": true, 00:31:43.868 "nvme_admin": true, 00:31:43.868 "nvme_io": true, 00:31:43.868 "nvme_io_md": false, 00:31:43.868 "write_zeroes": true, 00:31:43.868 "zcopy": false, 00:31:43.868 "get_zone_info": false, 00:31:43.868 "zone_management": false, 00:31:43.868 "zone_append": false, 00:31:43.868 "compare": false, 00:31:43.868 "compare_and_write": false, 00:31:43.868 "abort": true, 00:31:43.868 "seek_hole": false, 00:31:43.868 "seek_data": false, 00:31:43.868 "copy": false, 00:31:43.868 "nvme_iov_md": false 00:31:43.868 }, 00:31:43.868 "driver_specific": { 00:31:43.868 "nvme": [ 00:31:43.868 { 00:31:43.868 "pci_address": "0000:d8:00.0", 00:31:43.868 "trid": { 00:31:43.868 "trtype": "PCIe", 00:31:43.868 "traddr": "0000:d8:00.0" 00:31:43.868 }, 00:31:43.868 "ctrlr_data": { 00:31:43.869 "cntlid": 0, 00:31:43.869 "vendor_id": "0x8086", 00:31:43.869 "model_number": "INTEL SSDPE2KX020T8", 00:31:43.869 "serial_number": "BTLJ125505KA2P0BGN", 00:31:43.869 "firmware_revision": "VDV10170", 00:31:43.869 "oacs": { 00:31:43.869 "security": 0, 00:31:43.869 "format": 1, 00:31:43.869 "firmware": 1, 00:31:43.869 "ns_manage": 1 00:31:43.869 }, 00:31:43.869 "multi_ctrlr": false, 00:31:43.869 "ana_reporting": false 00:31:43.869 }, 00:31:43.869 "vs": { 00:31:43.869 "nvme_version": "1.2" 00:31:43.869 }, 00:31:43.869 "ns_data": { 00:31:43.869 "id": 1, 00:31:43.869 "can_share": false 00:31:43.869 } 00:31:43.869 } 00:31:43.869 ], 00:31:43.869 "mp_policy": "active_passive" 00:31:43.869 } 00:31:43.869 } 00:31:43.869 ] 
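The JSON dump that just ended is the probe step of waitforbdev, which the xtrace shows as bdev_wait_for_examine followed by bdev_get_bdevs -b Nvme0n1 -t 2000 before the helper returns 0 in the next entry. A minimal re-implementation under that reading follows; the function name and usage line are illustrative, the two RPCs are the ones in the trace.

#!/usr/bin/env bash
# Sketch of the waitforbdev pattern traced above: let examine finish, then ask for the bdev with a timeout.
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}   # assumption
RPC_PY="$SPDK_DIR/scripts/rpc.py"

waitforbdev_sketch() {
    local bdev_name=$1
    # Block until all registered examine callbacks have run, so claims and aliases are in place.
    "$RPC_PY" bdev_wait_for_examine
    # -t 2000 bounds how long the RPC waits for the bdev to appear (2000 is the suite's default bdev_timeout).
    "$RPC_PY" bdev_get_bdevs -b "$bdev_name" -t 2000 >/dev/null
}

waitforbdev_sketch Nvme0n1    # e.g. the NVMe bdev dumped just above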
00:31:43.869 07:37:16 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:43.869 07:37:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:45.247 588c57f2-0e60-404e-9918-6160d0153e82 00:31:45.247 07:37:17 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:45.247 ea40b583-4634-460e-a576-eacf5444495a 00:31:45.247 07:37:17 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:45.247 07:37:17 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:45.247 07:37:17 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:45.247 07:37:17 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:45.247 07:37:17 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:45.247 07:37:17 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:45.247 07:37:17 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:45.506 07:37:17 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:45.765 [ 00:31:45.765 { 00:31:45.766 "name": "ea40b583-4634-460e-a576-eacf5444495a", 00:31:45.766 "aliases": [ 00:31:45.766 "lvs0/lv0" 00:31:45.766 ], 00:31:45.766 "product_name": "Logical Volume", 00:31:45.766 "block_size": 512, 00:31:45.766 "num_blocks": 204800, 00:31:45.766 "uuid": "ea40b583-4634-460e-a576-eacf5444495a", 00:31:45.766 "assigned_rate_limits": { 00:31:45.766 "rw_ios_per_sec": 0, 00:31:45.766 "rw_mbytes_per_sec": 0, 00:31:45.766 "r_mbytes_per_sec": 0, 00:31:45.766 "w_mbytes_per_sec": 0 00:31:45.766 }, 00:31:45.766 "claimed": false, 00:31:45.766 "zoned": false, 00:31:45.766 "supported_io_types": { 00:31:45.766 "read": true, 00:31:45.766 "write": true, 00:31:45.766 "unmap": true, 00:31:45.766 "flush": false, 00:31:45.766 "reset": true, 00:31:45.766 "nvme_admin": false, 00:31:45.766 "nvme_io": false, 00:31:45.766 "nvme_io_md": false, 00:31:45.766 "write_zeroes": true, 00:31:45.766 "zcopy": false, 00:31:45.766 "get_zone_info": false, 00:31:45.766 "zone_management": false, 00:31:45.766 "zone_append": false, 00:31:45.766 "compare": false, 00:31:45.766 "compare_and_write": false, 00:31:45.766 "abort": false, 00:31:45.766 "seek_hole": true, 00:31:45.766 "seek_data": true, 00:31:45.766 "copy": false, 00:31:45.766 "nvme_iov_md": false 00:31:45.766 }, 00:31:45.766 "driver_specific": { 00:31:45.766 "lvol": { 00:31:45.766 "lvol_store_uuid": "588c57f2-0e60-404e-9918-6160d0153e82", 00:31:45.766 "base_bdev": "Nvme0n1", 00:31:45.766 "thin_provision": true, 00:31:45.766 "num_allocated_clusters": 0, 00:31:45.766 "snapshot": false, 00:31:45.766 "clone": false, 00:31:45.766 "esnap_clone": false 00:31:45.766 } 00:31:45.766 } 00:31:45.766 } 00:31:45.766 ] 00:31:45.766 07:37:18 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:45.766 07:37:18 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:45.766 07:37:18 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:46.025 [2024-07-25 07:37:18.365685] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device 
and virtual bdev for: COMP_lvs0/lv0 00:31:46.025 COMP_lvs0/lv0 00:31:46.025 07:37:18 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:46.025 07:37:18 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:46.025 07:37:18 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:46.025 07:37:18 compress_isal -- common/autotest_common.sh@901 -- # local i 00:31:46.025 07:37:18 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:46.025 07:37:18 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:46.025 07:37:18 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:46.285 07:37:18 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:46.285 [ 00:31:46.285 { 00:31:46.285 "name": "COMP_lvs0/lv0", 00:31:46.285 "aliases": [ 00:31:46.285 "2247c35f-3694-567a-bc7f-abc1570b5d27" 00:31:46.285 ], 00:31:46.285 "product_name": "compress", 00:31:46.285 "block_size": 512, 00:31:46.285 "num_blocks": 200704, 00:31:46.285 "uuid": "2247c35f-3694-567a-bc7f-abc1570b5d27", 00:31:46.285 "assigned_rate_limits": { 00:31:46.285 "rw_ios_per_sec": 0, 00:31:46.285 "rw_mbytes_per_sec": 0, 00:31:46.285 "r_mbytes_per_sec": 0, 00:31:46.285 "w_mbytes_per_sec": 0 00:31:46.285 }, 00:31:46.285 "claimed": false, 00:31:46.285 "zoned": false, 00:31:46.285 "supported_io_types": { 00:31:46.285 "read": true, 00:31:46.285 "write": true, 00:31:46.285 "unmap": false, 00:31:46.285 "flush": false, 00:31:46.285 "reset": false, 00:31:46.285 "nvme_admin": false, 00:31:46.285 "nvme_io": false, 00:31:46.285 "nvme_io_md": false, 00:31:46.285 "write_zeroes": true, 00:31:46.285 "zcopy": false, 00:31:46.285 "get_zone_info": false, 00:31:46.285 "zone_management": false, 00:31:46.285 "zone_append": false, 00:31:46.285 "compare": false, 00:31:46.285 "compare_and_write": false, 00:31:46.285 "abort": false, 00:31:46.285 "seek_hole": false, 00:31:46.285 "seek_data": false, 00:31:46.285 "copy": false, 00:31:46.285 "nvme_iov_md": false 00:31:46.285 }, 00:31:46.285 "driver_specific": { 00:31:46.285 "compress": { 00:31:46.285 "name": "COMP_lvs0/lv0", 00:31:46.285 "base_bdev_name": "ea40b583-4634-460e-a576-eacf5444495a", 00:31:46.285 "pm_path": "/tmp/pmem/d930cb21-a50c-4379-bc27-cde3fc588c81" 00:31:46.285 } 00:31:46.285 } 00:31:46.285 } 00:31:46.285 ] 00:31:46.545 07:37:18 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:31:46.545 07:37:18 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:46.545 I/O targets: 00:31:46.545 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:46.545 00:31:46.545 00:31:46.545 CUnit - A unit testing framework for C - Version 2.1-3 00:31:46.545 http://cunit.sourceforge.net/ 00:31:46.545 00:31:46.545 00:31:46.545 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:46.545 Test: blockdev write read block ...passed 00:31:46.545 Test: blockdev write zeroes read block ...passed 00:31:46.545 Test: blockdev write zeroes read no split ...passed 00:31:46.545 Test: blockdev write zeroes read split ...passed 00:31:46.545 Test: blockdev write zeroes read split partial ...passed 00:31:46.545 Test: blockdev reset ...[2024-07-25 07:37:18.984870] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 
00:31:46.545 passed 00:31:46.545 Test: blockdev write read 8 blocks ...passed 00:31:46.545 Test: blockdev write read size > 128k ...passed 00:31:46.545 Test: blockdev write read invalid size ...passed 00:31:46.545 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:46.545 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:46.545 Test: blockdev write read max offset ...passed 00:31:46.545 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:46.545 Test: blockdev writev readv 8 blocks ...passed 00:31:46.545 Test: blockdev writev readv 30 x 1block ...passed 00:31:46.545 Test: blockdev writev readv block ...passed 00:31:46.545 Test: blockdev writev readv size > 128k ...passed 00:31:46.545 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:46.545 Test: blockdev comparev and writev ...passed 00:31:46.545 Test: blockdev nvme passthru rw ...passed 00:31:46.545 Test: blockdev nvme passthru vendor specific ...passed 00:31:46.545 Test: blockdev nvme admin passthru ...passed 00:31:46.545 Test: blockdev copy ...passed 00:31:46.545 00:31:46.545 Run Summary: Type Total Ran Passed Failed Inactive 00:31:46.545 suites 1 1 n/a 0 0 00:31:46.545 tests 23 23 23 0 0 00:31:46.545 asserts 130 130 130 0 n/a 00:31:46.545 00:31:46.546 Elapsed time = 0.193 seconds 00:31:46.546 0 00:31:46.546 07:37:19 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:46.546 07:37:19 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:46.805 07:37:19 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:47.064 07:37:19 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:47.064 07:37:19 compress_isal -- compress/compress.sh@62 -- # killprocess 1802596 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1802596 ']' 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1802596 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@955 -- # uname 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1802596 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1802596' 00:31:47.064 killing process with pid 1802596 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@969 -- # kill 1802596 00:31:47.064 07:37:19 compress_isal -- common/autotest_common.sh@974 -- # wait 1802596 00:31:49.598 07:37:22 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:49.598 07:37:22 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:49.598 00:31:49.598 real 0m49.204s 00:31:49.598 user 1m51.948s 00:31:49.598 sys 0m4.017s 00:31:49.598 07:37:22 compress_isal -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:49.598 07:37:22 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:49.598 ************************************ 00:31:49.598 END TEST compress_isal 00:31:49.598 ************************************ 00:31:49.598 07:37:22 -- 
spdk/autotest.sh@356 -- # '[' 0 -eq 1 ']' 00:31:49.598 07:37:22 -- spdk/autotest.sh@360 -- # '[' 1 -eq 1 ']' 00:31:49.598 07:37:22 -- spdk/autotest.sh@361 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:49.598 07:37:22 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:49.598 07:37:22 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:49.598 07:37:22 -- common/autotest_common.sh@10 -- # set +x 00:31:49.598 ************************************ 00:31:49.598 START TEST blockdev_crypto_aesni 00:31:49.598 ************************************ 00:31:49.598 07:37:22 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:49.858 * Looking for test storage... 00:31:49.858 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:49.858 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # crypto_device= 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1804466 
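For reference, the setup_crypto_aesni_conf step this test performs while the target is paused by --wait-for-rpc reduces to an RPC sequence like the sketch below. The RPC method names match the notices captured further down in this trace; the option spellings, malloc sizes and key bytes are assumptions for illustration, not the exact blockdev.sh code.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
    $rpc dpdk_cryptodev_scan_accel_module              # register the DPDK cryptodev accel module
    $rpc dpdk_cryptodev_set_driver -d crypto_aesni_mb  # matches "Using driver crypto_aesni_mb" below (flag form assumed)
    $rpc accel_assign_opc -o encrypt -m dpdk_cryptodev
    $rpc accel_assign_opc -o decrypt -m dpdk_cryptodev
    $rpc framework_start_init                          # leave the --wait-for-rpc pause
    $rpc bdev_malloc_create -b Malloc0 32 512          # 32 MiB / 512 B blocks; sizes assumed from the bdev dump below
    $rpc accel_crypto_key_create -c AES_CBC -k 00112233445566778899aabbccddeeff -n test_dek_aesni_cbc_1  # key bytes assumed
    $rpc bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram  # flag/argument order varies per SPDK release

Repeating the last three calls for Malloc1..Malloc3 with test_dek_aesni_cbc_2..4 yields the crypto_ram2..crypto_ram4 bdevs exercised by the tests that follow.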
00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1804466 00:31:49.859 07:37:22 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:49.859 07:37:22 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # '[' -z 1804466 ']' 00:31:49.859 07:37:22 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:49.859 07:37:22 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:49.859 07:37:22 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:49.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:49.859 07:37:22 blockdev_crypto_aesni -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:49.859 07:37:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:49.859 [2024-07-25 07:37:22.276312] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:31:49.859 [2024-07-25 07:37:22.276378] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1804466 ] 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.5 cannot be 
used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:49.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:49.859 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:50.119 [2024-07-25 07:37:22.408405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:50.119 [2024-07-25 07:37:22.494124] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:31:50.687 07:37:23 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:50.687 07:37:23 blockdev_crypto_aesni -- common/autotest_common.sh@864 -- # return 0 00:31:50.687 07:37:23 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:31:50.687 07:37:23 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:31:50.687 07:37:23 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:31:50.687 07:37:23 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:50.687 07:37:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:50.687 [2024-07-25 07:37:23.168230] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:50.687 
[2024-07-25 07:37:23.176264] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:50.687 [2024-07-25 07:37:23.184281] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:50.946 [2024-07-25 07:37:23.253174] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:53.482 true 00:31:53.482 true 00:31:53.482 true 00:31:53.482 true 00:31:53.482 Malloc0 00:31:53.482 Malloc1 00:31:53.482 Malloc2 00:31:53.482 Malloc3 00:31:53.482 [2024-07-25 07:37:25.590866] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:53.482 crypto_ram 00:31:53.482 [2024-07-25 07:37:25.598887] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:53.482 crypto_ram2 00:31:53.482 [2024-07-25 07:37:25.606909] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:53.482 crypto_ram3 00:31:53.482 [2024-07-25 07:37:25.614931] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:53.482 crypto_ram4 00:31:53.482 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:53.482 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:31:53.482 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:53.482 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:53.482 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:53.482 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:31:53.482 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:31:53.482 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:53.482 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:53.483 07:37:25 
blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e06e69e0-c2f6-52de-9be9-e90cb1969468"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e06e69e0-c2f6-52de-9be9-e90cb1969468",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ae1c0768-06ba-51e6-a143-ec1be779558b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae1c0768-06ba-51e6-a143-ec1be779558b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "453a129e-40e1-5bea-921d-05081674c489"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "453a129e-40e1-5bea-921d-05081674c489",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram4",' ' "aliases": [' ' "1316ebfc-b180-56b2-b9ca-f38ee6d7dbb3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1316ebfc-b180-56b2-b9ca-f38ee6d7dbb3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:31:53.483 07:37:25 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 1804466 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # '[' -z 1804466 ']' 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # kill -0 1804466 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # uname 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1804466 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1804466' 00:31:53.483 killing process with pid 1804466 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@969 -- # kill 1804466 00:31:53.483 07:37:25 blockdev_crypto_aesni -- common/autotest_common.sh@974 -- # wait 1804466 00:31:54.050 07:37:26 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:54.050 07:37:26 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:54.050 07:37:26 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:31:54.050 07:37:26 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:54.050 07:37:26 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:54.050 ************************************ 00:31:54.050 START TEST bdev_hello_world 00:31:54.050 ************************************ 00:31:54.050 07:37:26 blockdev_crypto_aesni.bdev_hello_world -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:54.050 [2024-07-25 07:37:26.429294] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:31:54.050 [2024-07-25 07:37:26.429352] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1805069 ] 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 
0000:3f:01.4 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:54.050 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:54.050 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:54.050 [2024-07-25 07:37:26.559482] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:54.308 [2024-07-25 07:37:26.642987] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:31:54.308 [2024-07-25 07:37:26.664224] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:54.308 [2024-07-25 07:37:26.672245] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:54.308 [2024-07-25 07:37:26.680263] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:54.308 [2024-07-25 07:37:26.792103] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:56.842 [2024-07-25 07:37:28.964993] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:56.842 [2024-07-25 07:37:28.965063] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:56.842 [2024-07-25 07:37:28.965077] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:56.842 [2024-07-25 07:37:28.973012] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:56.842 [2024-07-25 07:37:28.973029] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:56.842 [2024-07-25 07:37:28.973040] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:56.842 [2024-07-25 07:37:28.981032] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:56.842 [2024-07-25 07:37:28.981048] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:56.842 [2024-07-25 07:37:28.981058] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred 
pending base bdev arrival 00:31:56.842 [2024-07-25 07:37:28.989052] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:56.842 [2024-07-25 07:37:28.989068] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:56.842 [2024-07-25 07:37:28.989078] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:56.842 [2024-07-25 07:37:29.060256] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:56.842 [2024-07-25 07:37:29.060299] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:56.842 [2024-07-25 07:37:29.060316] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:56.842 [2024-07-25 07:37:29.061482] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:56.842 [2024-07-25 07:37:29.061555] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:56.842 [2024-07-25 07:37:29.061571] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:56.842 [2024-07-25 07:37:29.061612] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:31:56.842 00:31:56.842 [2024-07-25 07:37:29.061629] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:57.102 00:31:57.102 real 0m3.003s 00:31:57.102 user 0m2.624s 00:31:57.102 sys 0m0.337s 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:57.102 ************************************ 00:31:57.102 END TEST bdev_hello_world 00:31:57.102 ************************************ 00:31:57.102 07:37:29 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:31:57.102 07:37:29 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:57.102 07:37:29 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:57.102 07:37:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:57.102 ************************************ 00:31:57.102 START TEST bdev_bounds 00:31:57.102 ************************************ 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1805610 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1805610' 00:31:57.102 Process bdevio pid: 1805610 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1805610 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1805610 ']' 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:57.102 07:37:29 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:57.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:57.102 07:37:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:57.102 [2024-07-25 07:37:29.492350] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:31:57.102 [2024-07-25 07:37:29.492393] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1805610 ] 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 
0000:3f:01.2 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:57.102 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:57.102 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:57.102 [2024-07-25 07:37:29.610054] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:57.361 [2024-07-25 07:37:29.701834] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:31:57.361 [2024-07-25 07:37:29.701928] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:31:57.361 [2024-07-25 07:37:29.701929] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:31:57.361 [2024-07-25 07:37:29.723224] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:57.361 [2024-07-25 07:37:29.731255] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:57.361 [2024-07-25 07:37:29.739277] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:57.361 [2024-07-25 07:37:29.837184] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:59.931 [2024-07-25 07:37:32.008235] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:59.931 [2024-07-25 07:37:32.008327] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:59.931 [2024-07-25 07:37:32.008341] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.931 [2024-07-25 07:37:32.016253] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:59.931 [2024-07-25 07:37:32.016272] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:59.931 [2024-07-25 
07:37:32.016284] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.931 [2024-07-25 07:37:32.024275] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:59.931 [2024-07-25 07:37:32.024296] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:59.931 [2024-07-25 07:37:32.024307] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.931 [2024-07-25 07:37:32.032295] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:59.931 [2024-07-25 07:37:32.032312] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:59.931 [2024-07-25 07:37:32.032323] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.931 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:59.931 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:31:59.931 07:37:32 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:59.931 I/O targets: 00:31:59.931 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:31:59.931 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:31:59.931 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:31:59.931 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:31:59.931 00:31:59.931 00:31:59.931 CUnit - A unit testing framework for C - Version 2.1-3 00:31:59.931 http://cunit.sourceforge.net/ 00:31:59.931 00:31:59.931 00:31:59.931 Suite: bdevio tests on: crypto_ram4 00:31:59.931 Test: blockdev write read block ...passed 00:31:59.931 Test: blockdev write zeroes read block ...passed 00:31:59.931 Test: blockdev write zeroes read no split ...passed 00:31:59.931 Test: blockdev write zeroes read split ...passed 00:31:59.931 Test: blockdev write zeroes read split partial ...passed 00:31:59.931 Test: blockdev reset ...passed 00:31:59.931 Test: blockdev write read 8 blocks ...passed 00:31:59.931 Test: blockdev write read size > 128k ...passed 00:31:59.931 Test: blockdev write read invalid size ...passed 00:31:59.931 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:59.931 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:59.931 Test: blockdev write read max offset ...passed 00:31:59.931 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:59.931 Test: blockdev writev readv 8 blocks ...passed 00:31:59.931 Test: blockdev writev readv 30 x 1block ...passed 00:31:59.931 Test: blockdev writev readv block ...passed 00:31:59.931 Test: blockdev writev readv size > 128k ...passed 00:31:59.931 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:59.931 Test: blockdev comparev and writev ...passed 00:31:59.931 Test: blockdev nvme passthru rw ...passed 00:31:59.931 Test: blockdev nvme passthru vendor specific ...passed 00:31:59.931 Test: blockdev nvme admin passthru ...passed 00:31:59.931 Test: blockdev copy ...passed 00:31:59.931 Suite: bdevio tests on: crypto_ram3 00:31:59.931 Test: blockdev write read block ...passed 00:31:59.931 Test: blockdev write zeroes read block ...passed 00:31:59.931 Test: blockdev write zeroes read no split ...passed 00:31:59.931 Test: blockdev write zeroes read split ...passed 00:31:59.931 Test: 
blockdev write zeroes read split partial ...passed 00:31:59.931 Test: blockdev reset ...passed 00:31:59.931 Test: blockdev write read 8 blocks ...passed 00:31:59.931 Test: blockdev write read size > 128k ...passed 00:31:59.931 Test: blockdev write read invalid size ...passed 00:31:59.931 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:59.931 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:59.931 Test: blockdev write read max offset ...passed 00:31:59.931 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:59.931 Test: blockdev writev readv 8 blocks ...passed 00:31:59.931 Test: blockdev writev readv 30 x 1block ...passed 00:31:59.931 Test: blockdev writev readv block ...passed 00:31:59.932 Test: blockdev writev readv size > 128k ...passed 00:31:59.932 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:59.932 Test: blockdev comparev and writev ...passed 00:31:59.932 Test: blockdev nvme passthru rw ...passed 00:31:59.932 Test: blockdev nvme passthru vendor specific ...passed 00:31:59.932 Test: blockdev nvme admin passthru ...passed 00:31:59.932 Test: blockdev copy ...passed 00:31:59.932 Suite: bdevio tests on: crypto_ram2 00:31:59.932 Test: blockdev write read block ...passed 00:31:59.932 Test: blockdev write zeroes read block ...passed 00:31:59.932 Test: blockdev write zeroes read no split ...passed 00:31:59.932 Test: blockdev write zeroes read split ...passed 00:31:59.932 Test: blockdev write zeroes read split partial ...passed 00:31:59.932 Test: blockdev reset ...passed 00:31:59.932 Test: blockdev write read 8 blocks ...passed 00:31:59.932 Test: blockdev write read size > 128k ...passed 00:31:59.932 Test: blockdev write read invalid size ...passed 00:31:59.932 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:59.932 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:59.932 Test: blockdev write read max offset ...passed 00:31:59.932 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:59.932 Test: blockdev writev readv 8 blocks ...passed 00:31:59.932 Test: blockdev writev readv 30 x 1block ...passed 00:31:59.932 Test: blockdev writev readv block ...passed 00:31:59.932 Test: blockdev writev readv size > 128k ...passed 00:31:59.932 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:59.932 Test: blockdev comparev and writev ...passed 00:31:59.932 Test: blockdev nvme passthru rw ...passed 00:31:59.932 Test: blockdev nvme passthru vendor specific ...passed 00:31:59.932 Test: blockdev nvme admin passthru ...passed 00:31:59.932 Test: blockdev copy ...passed 00:31:59.932 Suite: bdevio tests on: crypto_ram 00:31:59.932 Test: blockdev write read block ...passed 00:31:59.932 Test: blockdev write zeroes read block ...passed 00:31:59.932 Test: blockdev write zeroes read no split ...passed 00:31:59.932 Test: blockdev write zeroes read split ...passed 00:32:00.191 Test: blockdev write zeroes read split partial ...passed 00:32:00.191 Test: blockdev reset ...passed 00:32:00.191 Test: blockdev write read 8 blocks ...passed 00:32:00.191 Test: blockdev write read size > 128k ...passed 00:32:00.191 Test: blockdev write read invalid size ...passed 00:32:00.191 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:00.191 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:00.191 Test: blockdev write read max offset ...passed 00:32:00.191 Test: blockdev write read 2 
blocks on overlapped address offset ...passed 00:32:00.191 Test: blockdev writev readv 8 blocks ...passed 00:32:00.191 Test: blockdev writev readv 30 x 1block ...passed 00:32:00.191 Test: blockdev writev readv block ...passed 00:32:00.191 Test: blockdev writev readv size > 128k ...passed 00:32:00.191 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:00.191 Test: blockdev comparev and writev ...passed 00:32:00.191 Test: blockdev nvme passthru rw ...passed 00:32:00.191 Test: blockdev nvme passthru vendor specific ...passed 00:32:00.191 Test: blockdev nvme admin passthru ...passed 00:32:00.191 Test: blockdev copy ...passed 00:32:00.191 00:32:00.191 Run Summary: Type Total Ran Passed Failed Inactive 00:32:00.191 suites 4 4 n/a 0 0 00:32:00.191 tests 92 92 92 0 0 00:32:00.191 asserts 520 520 520 0 n/a 00:32:00.191 00:32:00.191 Elapsed time = 0.507 seconds 00:32:00.191 0 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1805610 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1805610 ']' 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1805610 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1805610 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1805610' 00:32:00.191 killing process with pid 1805610 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1805610 00:32:00.191 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1805610 00:32:00.450 07:37:32 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:32:00.450 00:32:00.450 real 0m3.432s 00:32:00.450 user 0m9.647s 00:32:00.450 sys 0m0.495s 00:32:00.450 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:00.450 07:37:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:00.450 ************************************ 00:32:00.450 END TEST bdev_bounds 00:32:00.450 ************************************ 00:32:00.451 07:37:32 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:00.451 07:37:32 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:32:00.451 07:37:32 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:00.451 07:37:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:00.451 ************************************ 00:32:00.451 START TEST bdev_nbd 00:32:00.451 ************************************ 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1806171 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1806171 /var/tmp/spdk-nbd.sock 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1806171 ']' 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:00.451 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
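For reference, once bdev_svc is listening on /var/tmp/spdk-nbd.sock the nbd helpers attach each bdev in bdev_list to the matching /dev/nbd device over that socket. A minimal sketch of that mapping is below; nbd_start_disk, nbd_get_disks and nbd_stop_disk are real SPDK RPCs, but the loop itself is an illustration rather than the exact nbd_common.sh implementation.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
    bdev_list=(crypto_ram crypto_ram2 crypto_ram3 crypto_ram4)
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)
    for i in "${!bdev_list[@]}"; do
        $rpc nbd_start_disk "${bdev_list[$i]}" "${nbd_list[$i]}"   # export the bdev as a kernel block device
    done
    $rpc nbd_get_disks                                             # confirm the bdev <-> /dev/nbdX mapping
    for nbd in "${nbd_list[@]}"; do
        $rpc nbd_stop_disk "$nbd"                                  # detach before shutting the target down
    done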
00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:00.451 07:37:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:00.710 [2024-07-25 07:37:33.037314] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:32:00.710 [2024-07-25 07:37:33.037368] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.4 cannot 
be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:00.710 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:00.710 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:00.710 [2024-07-25 07:37:33.171472] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:00.971 [2024-07-25 07:37:33.260501] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:32:00.971 [2024-07-25 07:37:33.281745] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:00.971 [2024-07-25 07:37:33.289766] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:00.971 [2024-07-25 07:37:33.297783] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:00.971 [2024-07-25 07:37:33.406459] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:03.506 [2024-07-25 07:37:35.579274] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:03.506 [2024-07-25 07:37:35.579328] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:03.506 [2024-07-25 07:37:35.579343] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.506 [2024-07-25 07:37:35.587293] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:03.506 [2024-07-25 07:37:35.587311] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:03.506 [2024-07-25 07:37:35.587322] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.506 [2024-07-25 07:37:35.595328] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:03.506 [2024-07-25 07:37:35.595344] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:03.506 [2024-07-25 07:37:35.595355] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:32:03.506 [2024-07-25 07:37:35.603335] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:03.507 [2024-07-25 07:37:35.603350] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:03.507 [2024-07-25 07:37:35.603361] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd 
if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:03.507 1+0 records in 00:32:03.507 1+0 records out 00:32:03.507 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271484 s, 15.1 MB/s 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:03.507 07:37:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:03.766 1+0 records in 00:32:03.766 1+0 records out 00:32:03.766 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324236 s, 12.6 MB/s 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:03.766 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:04.025 1+0 records in 00:32:04.025 1+0 records out 00:32:04.025 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286043 s, 14.3 MB/s 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:04.025 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:04.284 1+0 records in 00:32:04.284 1+0 records out 00:32:04.284 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310743 s, 13.2 MB/s 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:04.284 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:04.543 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd0", 00:32:04.543 "bdev_name": "crypto_ram" 00:32:04.543 }, 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd1", 00:32:04.543 "bdev_name": "crypto_ram2" 00:32:04.543 }, 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd2", 00:32:04.543 "bdev_name": "crypto_ram3" 00:32:04.543 }, 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd3", 00:32:04.543 "bdev_name": "crypto_ram4" 00:32:04.543 } 00:32:04.543 ]' 00:32:04.543 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:04.543 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd0", 00:32:04.543 "bdev_name": "crypto_ram" 00:32:04.543 }, 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd1", 00:32:04.543 "bdev_name": "crypto_ram2" 00:32:04.543 }, 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd2", 00:32:04.543 "bdev_name": "crypto_ram3" 00:32:04.543 }, 00:32:04.543 { 00:32:04.543 "nbd_device": "/dev/nbd3", 00:32:04.543 "bdev_name": "crypto_ram4" 00:32:04.543 } 00:32:04.543 ]' 00:32:04.543 07:37:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:04.543 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:04.543 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:04.543 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:04.543 
07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:04.543 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:04.543 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:04.543 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:04.801 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:04.802 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:05.060 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:05.319 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:05.577 07:37:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:05.577 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:05.577 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:05.577 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # 
nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:05.836 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:05.837 /dev/nbd0 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:05.837 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:06.095 1+0 records in 00:32:06.095 1+0 records out 00:32:06.095 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277591 s, 14.8 MB/s 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:32:06.095 /dev/nbd1 00:32:06.095 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:06.354 1+0 records in 00:32:06.354 1+0 records out 00:32:06.354 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224807 s, 18.2 MB/s 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:32:06.354 /dev/nbd10 00:32:06.354 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 
00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:06.613 1+0 records in 00:32:06.613 1+0 records out 00:32:06.613 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336554 s, 12.2 MB/s 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:06.613 07:37:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:32:06.613 /dev/nbd11 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:06.613 1+0 records in 00:32:06.613 1+0 records out 00:32:06.613 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000369951 s, 11.1 MB/s 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:06.613 
07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:06.613 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd0", 00:32:06.872 "bdev_name": "crypto_ram" 00:32:06.872 }, 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd1", 00:32:06.872 "bdev_name": "crypto_ram2" 00:32:06.872 }, 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd10", 00:32:06.872 "bdev_name": "crypto_ram3" 00:32:06.872 }, 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd11", 00:32:06.872 "bdev_name": "crypto_ram4" 00:32:06.872 } 00:32:06.872 ]' 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd0", 00:32:06.872 "bdev_name": "crypto_ram" 00:32:06.872 }, 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd1", 00:32:06.872 "bdev_name": "crypto_ram2" 00:32:06.872 }, 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd10", 00:32:06.872 "bdev_name": "crypto_ram3" 00:32:06.872 }, 00:32:06.872 { 00:32:06.872 "nbd_device": "/dev/nbd11", 00:32:06.872 "bdev_name": "crypto_ram4" 00:32:06.872 } 00:32:06.872 ]' 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:06.872 /dev/nbd1 00:32:06.872 /dev/nbd10 00:32:06.872 /dev/nbd11' 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:06.872 /dev/nbd1 00:32:06.872 /dev/nbd10 00:32:06.872 /dev/nbd11' 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:06.872 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:07.130 256+0 records in 00:32:07.130 256+0 records out 00:32:07.130 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104069 s, 101 MB/s 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:07.131 256+0 records in 00:32:07.131 256+0 records out 00:32:07.131 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.053315 s, 19.7 MB/s 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:07.131 256+0 records in 00:32:07.131 256+0 records out 00:32:07.131 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.054768 s, 19.1 MB/s 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:07.131 256+0 records in 00:32:07.131 256+0 records out 00:32:07.131 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0460171 s, 22.8 MB/s 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:07.131 256+0 records in 00:32:07.131 256+0 records out 00:32:07.131 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0530809 s, 19.8 MB/s 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:07.131 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 
-- # for i in "${nbd_list[@]}" 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:07.389 07:37:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:07.648 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:07.906 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:08.165 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:08.423 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 
00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:08.424 07:37:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:08.682 malloc_lvol_verify 00:32:08.682 07:37:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:09.250 2215cb43-2e31-4989-9f11-ccf049ee3c73 00:32:09.250 07:37:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:09.508 6cd85096-445c-4bb5-8dce-196c76cc013e 00:32:09.508 07:37:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:09.765 /dev/nbd0 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:10.023 mke2fs 1.46.5 (30-Dec-2021) 00:32:10.023 Discarding device blocks: 0/4096 done 00:32:10.023 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:10.023 00:32:10.023 Allocating group tables: 0/1 done 00:32:10.023 Writing inode tables: 0/1 done 00:32:10.023 Creating journal (1024 blocks): done 00:32:10.023 Writing superblocks and filesystem accounting information: 0/1 done 00:32:10.023 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:10.023 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:10.281 07:37:42 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1806171 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1806171 ']' 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1806171 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1806171 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1806171' 00:32:10.281 killing process with pid 1806171 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1806171 00:32:10.281 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1806171 00:32:10.540 07:37:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:32:10.540 00:32:10.540 real 0m9.994s 00:32:10.540 user 0m13.166s 00:32:10.540 sys 0m3.726s 00:32:10.540 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:10.540 07:37:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:10.540 ************************************ 00:32:10.540 END TEST bdev_nbd 00:32:10.540 ************************************ 00:32:10.540 07:37:42 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:32:10.540 07:37:43 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:32:10.540 07:37:43 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:10.540 07:37:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:10.540 ************************************ 00:32:10.540 START TEST bdev_fio 00:32:10.540 ************************************ 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@330 -- # local env_context 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:10.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:10.540 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:32:10.799 07:37:43 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:10.799 ************************************ 00:32:10.799 START TEST bdev_fio_rw_verify 00:32:10.799 ************************************ 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:10.799 07:37:43 
blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:10.799 07:37:43 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:11.058 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:11.058 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:11.058 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:11.058 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:11.058 fio-3.35 00:32:11.058 Starting 4 threads 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.3 cannot be used 
00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:11.317 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:11.317 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:11.317 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:26.225 00:32:26.225 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1808637: Thu Jul 25 07:37:56 2024 00:32:26.225 read: IOPS=23.5k, BW=92.0MiB/s (96.4MB/s)(920MiB/10001msec) 00:32:26.225 slat (usec): min=15, max=354, avg=56.73, stdev=26.50 00:32:26.225 clat (usec): min=14, max=1873, avg=305.11, stdev=176.94 00:32:26.225 lat (usec): min=43, max=2006, avg=361.83, stdev=190.01 00:32:26.225 clat percentiles (usec): 00:32:26.225 | 50.000th=[ 269], 99.000th=[ 799], 99.900th=[ 979], 99.990th=[ 1254], 00:32:26.225 | 99.999th=[ 1680] 00:32:26.225 write: IOPS=26.0k, BW=101MiB/s (106MB/s)(987MiB/9732msec); 0 zone resets 00:32:26.225 slat (usec): min=21, max=1138, avg=68.19, stdev=26.34 00:32:26.225 clat (usec): min=32, max=1880, avg=369.29, stdev=211.24 00:32:26.225 lat (usec): min=66, max=1998, avg=437.47, stdev=224.54 00:32:26.225 clat percentiles (usec): 00:32:26.225 | 50.000th=[ 334], 99.000th=[ 1004], 99.900th=[ 1237], 99.990th=[ 1385], 00:32:26.225 | 99.999th=[ 1795] 00:32:26.225 bw ( KiB/s): min=86536, max=139580, per=97.82%, avg=101545.05, stdev=3161.46, samples=76 00:32:26.225 iops : min=21634, max=34895, avg=25386.26, stdev=790.36, samples=76 00:32:26.225 lat (usec) : 20=0.01%, 50=0.01%, 100=6.66%, 250=32.32%, 500=42.20% 00:32:26.225 lat (usec) : 750=14.59%, 1000=3.67% 00:32:26.225 lat (msec) : 2=0.56% 00:32:26.225 cpu : usr=99.63%, sys=0.00%, ctx=75, majf=0, minf=272 00:32:26.225 IO depths : 1=10.0%, 2=25.6%, 4=51.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:26.225 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:26.225 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:26.225 issued rwts: total=235419,252560,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:26.225 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:26.225 00:32:26.225 Run status group 0 (all jobs): 00:32:26.225 READ: bw=92.0MiB/s (96.4MB/s), 92.0MiB/s-92.0MiB/s (96.4MB/s-96.4MB/s), io=920MiB (964MB), run=10001-10001msec 00:32:26.225 WRITE: bw=101MiB/s (106MB/s), 101MiB/s-101MiB/s (106MB/s-106MB/s), io=987MiB (1034MB), run=9732-9732msec 00:32:26.225 00:32:26.225 real 0m13.450s 00:32:26.225 user 0m53.260s 00:32:26.225 sys 0m0.475s 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:26.225 ************************************ 00:32:26.225 END TEST bdev_fio_rw_verify 00:32:26.225 ************************************ 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:26.225 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e06e69e0-c2f6-52de-9be9-e90cb1969468"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e06e69e0-c2f6-52de-9be9-e90cb1969468",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ae1c0768-06ba-51e6-a143-ec1be779558b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae1c0768-06ba-51e6-a143-ec1be779558b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "453a129e-40e1-5bea-921d-05081674c489"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "453a129e-40e1-5bea-921d-05081674c489",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "1316ebfc-b180-56b2-b9ca-f38ee6d7dbb3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1316ebfc-b180-56b2-b9ca-f38ee6d7dbb3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:32:26.226 crypto_ram2 00:32:26.226 crypto_ram3 00:32:26.226 crypto_ram4 ]] 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e06e69e0-c2f6-52de-9be9-e90cb1969468"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e06e69e0-c2f6-52de-9be9-e90cb1969468",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ae1c0768-06ba-51e6-a143-ec1be779558b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae1c0768-06ba-51e6-a143-ec1be779558b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "453a129e-40e1-5bea-921d-05081674c489"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "453a129e-40e1-5bea-921d-05081674c489",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "1316ebfc-b180-56b2-b9ca-f38ee6d7dbb3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1316ebfc-b180-56b2-b9ca-f38ee6d7dbb3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' 
"write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:26.226 ************************************ 00:32:26.226 START TEST bdev_fio_trim 00:32:26.226 ************************************ 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # 
fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:26.226 07:37:56 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:26.226 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.227 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.227 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.227 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.227 fio-3.35 00:32:26.227 Starting 4 threads 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 
EAL: Requested device 0000:3f:01.5 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:26.227 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:26.227 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:38.440 00:32:38.440 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1811123: Thu Jul 25 07:38:09 2024 00:32:38.440 write: IOPS=41.5k, BW=162MiB/s (170MB/s)(1621MiB/10001msec); 0 zone resets 00:32:38.440 slat (usec): min=15, max=1061, avg=55.42, stdev=35.56 00:32:38.440 clat (usec): min=30, max=1711, avg=245.43, stdev=169.44 00:32:38.440 lat (usec): min=46, max=1981, avg=300.85, stdev=192.98 00:32:38.440 clat percentiles (usec): 00:32:38.440 | 50.000th=[ 198], 99.000th=[ 873], 99.900th=[ 1074], 99.990th=[ 1123], 00:32:38.440 | 99.999th=[ 1500] 00:32:38.440 bw ( KiB/s): min=155512, max=190784, per=100.00%, avg=166297.68, stdev=3131.36, samples=76 00:32:38.440 iops : min=38878, max=47696, avg=41574.42, stdev=782.84, samples=76 00:32:38.440 trim: IOPS=41.5k, BW=162MiB/s (170MB/s)(1621MiB/10001msec); 0 zone resets 00:32:38.440 slat (usec): min=5, max=396, avg=14.50, stdev= 6.28 00:32:38.440 clat (usec): min=46, max=1291, avg=231.91, stdev=109.68 00:32:38.440 lat (usec): min=52, max=1302, avg=246.41, stdev=111.89 00:32:38.440 clat percentiles (usec): 00:32:38.440 | 50.000th=[ 212], 99.000th=[ 611], 99.900th=[ 709], 99.990th=[ 766], 00:32:38.440 | 99.999th=[ 1090] 00:32:38.440 bw ( KiB/s): min=155504, max=190800, per=100.00%, avg=166298.95, stdev=3131.25, samples=76 00:32:38.440 iops : min=38876, max=47700, avg=41574.74, stdev=782.81, samples=76 00:32:38.440 lat (usec) : 50=0.01%, 100=9.29%, 250=56.78%, 500=27.73%, 750=5.07% 00:32:38.440 lat (usec) : 1000=0.99% 00:32:38.440 lat (msec) : 2=0.13% 00:32:38.440 cpu : usr=99.60%, sys=0.01%, ctx=64, majf=0, minf=90 00:32:38.440 IO depths : 1=8.2%, 2=26.2%, 4=52.4%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:38.440 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:38.440 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:38.440 issued rwts: total=0,414954,414954,0 short=0,0,0,0 dropped=0,0,0,0 00:32:38.440 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:38.440 00:32:38.440 Run status group 0 (all jobs): 00:32:38.440 WRITE: bw=162MiB/s 
(170MB/s), 162MiB/s-162MiB/s (170MB/s-170MB/s), io=1621MiB (1700MB), run=10001-10001msec 00:32:38.440 TRIM: bw=162MiB/s (170MB/s), 162MiB/s-162MiB/s (170MB/s-170MB/s), io=1621MiB (1700MB), run=10001-10001msec 00:32:38.440 00:32:38.440 real 0m13.495s 00:32:38.440 user 0m53.284s 00:32:38.440 sys 0m0.491s 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:38.440 ************************************ 00:32:38.440 END TEST bdev_fio_trim 00:32:38.440 ************************************ 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:32:38.440 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:32:38.440 00:32:38.440 real 0m27.287s 00:32:38.440 user 1m46.728s 00:32:38.440 sys 0m1.146s 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:38.440 ************************************ 00:32:38.440 END TEST bdev_fio 00:32:38.440 ************************************ 00:32:38.440 07:38:10 blockdev_crypto_aesni -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:38.440 07:38:10 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:38.440 07:38:10 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:32:38.440 07:38:10 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:38.440 07:38:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:38.440 ************************************ 00:32:38.440 START TEST bdev_verify 00:32:38.440 ************************************ 00:32:38.440 07:38:10 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:38.440 [2024-07-25 07:38:10.446217] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:32:38.440 [2024-07-25 07:38:10.446276] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1812743 ] 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:38.440 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.440 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:38.441 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:38.441 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:38.441 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:38.441 [2024-07-25 07:38:10.577603] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:38.441 [2024-07-25 07:38:10.661401] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:32:38.441 [2024-07-25 07:38:10.661407] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:32:38.441 [2024-07-25 07:38:10.682719] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:38.441 [2024-07-25 07:38:10.690742] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:38.441 [2024-07-25 07:38:10.698761] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:38.441 [2024-07-25 07:38:10.816169] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:40.978 [2024-07-25 07:38:12.985322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:40.978 [2024-07-25 07:38:12.985395] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:40.978 [2024-07-25 07:38:12.985408] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:40.978 [2024-07-25 07:38:12.993338] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:40.978 [2024-07-25 07:38:12.993356] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:40.978 [2024-07-25 07:38:12.993366] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:40.978 [2024-07-25 07:38:13.001362] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:40.978 [2024-07-25 07:38:13.001378] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:40.978 [2024-07-25 07:38:13.001388] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:40.978 [2024-07-25 07:38:13.009384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:40.978 [2024-07-25 07:38:13.009400] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:40.978 [2024-07-25 07:38:13.009411] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:40.978 Running I/O for 5 seconds... 00:32:46.252 00:32:46.252 Latency(us) 00:32:46.252 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:46.252 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x0 length 0x1000 00:32:46.252 crypto_ram : 5.06 531.03 2.07 0.00 0.00 240304.25 16462.64 161061.27 00:32:46.252 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x1000 length 0x1000 00:32:46.252 crypto_ram : 5.06 531.07 2.07 0.00 0.00 240104.38 19084.08 161061.27 00:32:46.252 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x0 length 0x1000 00:32:46.252 crypto_ram2 : 5.07 530.65 2.07 0.00 0.00 239617.94 8808.04 151833.80 00:32:46.252 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x1000 length 0x1000 00:32:46.252 crypto_ram2 : 5.06 530.83 2.07 0.00 0.00 239385.13 8808.04 151833.80 00:32:46.252 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x0 length 0x1000 00:32:46.252 crypto_ram3 : 5.05 4165.83 16.27 0.00 0.00 30413.60 2949.12 27053.26 00:32:46.252 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x1000 length 0x1000 00:32:46.252 crypto_ram3 : 5.05 4182.77 16.34 0.00 0.00 30302.32 7602.18 27262.98 00:32:46.252 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x0 length 0x1000 00:32:46.252 crypto_ram4 : 5.06 4177.10 16.32 0.00 0.00 30285.48 2110.26 27262.98 00:32:46.252 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:46.252 Verification LBA range: start 0x1000 length 0x1000 00:32:46.252 crypto_ram4 : 5.06 4200.50 16.41 0.00 0.00 30113.41 2044.72 26738.69 00:32:46.252 =================================================================================================================== 00:32:46.252 Total : 18849.78 73.63 0.00 0.00 53927.58 2044.72 161061.27 00:32:46.252 00:32:46.252 real 0m8.120s 00:32:46.252 user 0m15.454s 00:32:46.252 sys 0m0.333s 00:32:46.252 07:38:18 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:46.252 07:38:18 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:32:46.252 ************************************ 00:32:46.252 END TEST bdev_verify 00:32:46.252 ************************************ 00:32:46.252 07:38:18 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:46.252 07:38:18 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:32:46.252 07:38:18 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:46.252 07:38:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:46.252 ************************************ 00:32:46.252 
00:32:46.252 07:38:18 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:46.252 07:38:18 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']'
00:32:46.252 07:38:18 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:32:46.252 07:38:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:46.252 ************************************
00:32:46.252 START TEST bdev_verify_big_io
00:32:46.252 ************************************
00:32:46.252 07:38:18 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:46.252 [2024-07-25 07:38:18.650540] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization...
00:32:46.252 [2024-07-25 07:38:18.650594] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1814077 ]
00:32:46.252 qat_pci_device_allocate(): Reached maximum number of QAT devices
00:32:46.252 EAL: Requested device 0000:3d:01.0 cannot be used
[... the same pair of messages repeats for each remaining QAT virtual function: 0000:3d:01.1 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7 ...]
00:32:46.253 [2024-07-25 07:38:18.781980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:32:46.512 [2024-07-25 07:38:18.865741] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1
00:32:46.512 [2024-07-25 07:38:18.865746] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0
00:32:46.512 [2024-07-25 07:38:18.887159] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:46.512 [2024-07-25 07:38:18.895180] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:46.512 [2024-07-25 07:38:18.903200] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:46.512 [2024-07-25 07:38:19.017918] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
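In this run the driver selection and opcode routing shown in the notices above come from the JSON config passed to bdevperf. For interactive experiments the same setup could roughly be driven over RPC against an app started with --wait-for-rpc; the rpc.py sub-commands and flag spellings below are written from memory of current SPDK conventions, so treat them as assumptions rather than a verified recipe.

  # Sketch (assumed rpc.py spellings): pick the aesni_mb software driver and
  # route encrypt/decrypt to the DPDK cryptodev accel module, as the notices report.
  ./scripts/rpc.py dpdk_cryptodev_set_driver -d crypto_aesni_mb
  ./scripts/rpc.py dpdk_cryptodev_scan_accel_module
  ./scripts/rpc.py accel_assign_opc -o encrypt -m dpdk_cryptodev
  ./scripts/rpc.py accel_assign_opc -o decrypt -m dpdk_cryptodev
  ./scripts/rpc.py framework_start_init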
00:32:49.057 [2024-07-25 07:38:21.187637] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:49.057 [2024-07-25 07:38:21.187712] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:49.057 [2024-07-25 07:38:21.187727] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:49.057 [2024-07-25 07:38:21.195652] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:49.057 [2024-07-25 07:38:21.195669] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:49.057 [2024-07-25 07:38:21.195680] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:49.057 [2024-07-25 07:38:21.203674] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:49.057 [2024-07-25 07:38:21.203690] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:49.057 [2024-07-25 07:38:21.203700] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:49.057 [2024-07-25 07:38:21.211697] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:49.057 [2024-07-25 07:38:21.211713] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:49.057 [2024-07-25 07:38:21.211724] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
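The four deferred-creation notices above correspond to crypto vbdevs being layered on Malloc0 through Malloc3 with pre-registered DEKs; creation completes as soon as each base bdev appears. A rough RPC equivalent is sketched below: the key and bdev names are taken from this log, but the base-to-vbdev pairing, the flag spellings, and the key material are assumptions. In this job the same objects are created from the --json config rather than over live RPC.

  # Sketch: register a key and stack a crypto vbdev on a malloc base bdev.
  # <hex_key> is placeholder key material, not a value from this job.
  ./scripts/rpc.py accel_crypto_key_create -c AES_CBC -k <hex_key> -n test_dek_aesni_cbc_1
  ./scripts/rpc.py bdev_crypto_create Malloc0 crypto_ram -n test_dek_aesni_cbc_1
  # The "deferred pending base bdev arrival" notice clears once the base exists:
  ./scripts/rpc.py bdev_malloc_create -b Malloc0 64 512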
00:32:49.057 Running I/O for 5 seconds...
00:32:51.594 [2024-07-25 07:38:23.684373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously between 07:38:23.68 and 07:38:24.03 while the big-I/O verify workload runs ...]
00:32:51.598 [2024-07-25 07:38:24.027431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... the matching "Failed to get dst_mbufs!" error from accel_dpdk_cryptodev.c:476 then repeats in the same way for the remainder of the run ...]
00:32:51.599 [2024-07-25 07:38:24.112112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.112544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.113250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.113302] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.114496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.115619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.115991] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.116039] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.116779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.117196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.118295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.118345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.119525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.120708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.121080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.121146] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.122666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.123041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.123475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.123529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.599 [2024-07-25 07:38:24.124644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.125754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.126586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.126640] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:51.861 [2024-07-25 07:38:24.127750] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.128174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.129478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.130601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.130652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.132042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.133537] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.135030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.135089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.136765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.138154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.138208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.138568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.140883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.142198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.142250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.143332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.861 [2024-07-25 07:38:24.144091] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.144160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.144795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.145846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.148203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.148262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.148633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:51.862 [2024-07-25 07:38:24.148994] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.149430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.150550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.151723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.151773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.152843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.153226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.153590] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.153636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.154368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.154740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.154791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.155155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.156786] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.157177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.157242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.157611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.158463] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.158522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.158890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.159276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.160927] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.161006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.161392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:51.862 [2024-07-25 07:38:24.161756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.162190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.162568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.162939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.162990] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.164318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.164706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.165071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.165121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.165883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.166277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.166332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.166692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.168286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.168656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.168716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.169079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.169879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.169948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.170320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.170688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.172360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.172426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.172790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:51.862 [2024-07-25 07:38:24.173159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.173701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.174075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.174455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.174508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.175878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.176272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.176638] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.176688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.177556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.177934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.177987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.178355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.180035] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.180414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.180466] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.180841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.181626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:51.862 [2024-07-25 07:38:24.182089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.182459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.182507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.184082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.184110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.184476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.862 [2024-07-25 07:38:24.184838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.184882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.185189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.185636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.186009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.186072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.186443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.188075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.188462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.188507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.862 [2024-07-25 07:38:24.188863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.189217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.189359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.189727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.189772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.190122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.191385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.191749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.191791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.192152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.192497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.192645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.193012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.193057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.863 [2024-07-25 07:38:24.193418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.194459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.195188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.195233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.196320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.196569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.196707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.197067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.197110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.197470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.198546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.200051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.200100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.200144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.200392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.200532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.201405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.201452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.201489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.202599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.203774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.203818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.204919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.205237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.863 [2024-07-25 07:38:24.205374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.206935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.206977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.208491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.209993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.210660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.211958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.212757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.863 [2024-07-25 07:38:24.214098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.214855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.216845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.217948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.219239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.219283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.220017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.220267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.220404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.221305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.221357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.863 [2024-07-25 07:38:24.222730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.224362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.225852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.225913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.227250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.227605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.863 [2024-07-25 07:38:24.227741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.229164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.229207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.230120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.231173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.231539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.231578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.233042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.233319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.233455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.234056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.234101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.235309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.236383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.237909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.237974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.238338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.238585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.864 [2024-07-25 07:38:24.238722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.239933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.239977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.241265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.242386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.243230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.243274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.243961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.244213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.244348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.245553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.245597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.246873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.247926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.249420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.249471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.250936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.251185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.251322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.252643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.252687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.253063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.254268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.255832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.864 [2024-07-25 07:38:24.255883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.257309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.257630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.257767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.259091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.259136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.260426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.261538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.262234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.262280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.263735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.264158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.264296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.265782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.265838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.267242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.268365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.269575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.269619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.270842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.271088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.271230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.272128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.272178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.864 [2024-07-25 07:38:24.273417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.274914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.276208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.276252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.277490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.277735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.277874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.279066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.279109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.280498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.281635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.282488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.282533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.283555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.283868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.283999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.284045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.284567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.284609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.285703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.285759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.287202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.287246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:51.864 [2024-07-25 07:38:24.287573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:51.864 [2024-07-25 07:38:24.287711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.393 [... identical *ERROR*: Failed to get src_mbufs! messages from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeat for every log entry between 07:38:24.287711 and 07:38:24.694850 (several hundred occurrences, console timestamps 00:32:51.864 through 00:32:52.393); duplicate entries omitted ...]
00:32:52.393 [2024-07-25 07:38:24.696274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.393 [2024-07-25 07:38:24.696318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.393 [2024-07-25 07:38:24.700905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.700959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.702173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.702217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.702464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.703472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.703523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.704938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.704984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.708843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.708897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.709570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.709614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.709861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.711152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.711203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.712423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.712465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.716998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.717062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.718106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.718154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.394 [2024-07-25 07:38:24.718518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.719932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.719985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.720394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.720438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.725693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.725746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.726969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.727013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.727267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.728337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.728392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.729741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.729782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.735032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.735091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.736395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.736436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.736688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.737990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.738040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.739245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.739288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.742063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.394 [2024-07-25 07:38:24.742116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.743312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.743354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.743650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.745199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.745250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.746050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.746099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.751149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.751201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.752164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.752206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.752573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.753772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.753822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.753865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.755233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.760679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.760739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.760783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.762023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.762277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.762415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.762465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.394 [2024-07-25 07:38:24.762836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.762877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.765652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.765709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.767164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.767206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.767452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.767587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.768792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.768836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.768873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.773028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.773999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.774042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.394 [2024-07-25 07:38:24.774083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.774336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.775624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.775675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.775712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.776918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.781432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.781485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.781534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.782571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.395 [2024-07-25 07:38:24.782885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.783025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.783068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.784511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.784558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.788230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.788288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.789635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.789679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.789926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.790071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.791517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.791560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.791598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.796012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.797366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.797412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.797449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.797696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.799238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.799289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.799327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.800214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.805344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.395 [2024-07-25 07:38:24.805396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.805434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.806362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.806672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.806812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.806855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.808114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.808160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.811341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.811398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.812792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.812839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.813085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.813230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.814349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.814396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.814440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.816182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.817693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.817737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.817778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.818025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.819408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.819457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.395 [2024-07-25 07:38:24.819495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.820959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.824481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.824533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.824571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.825464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.825717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.825851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.825894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.827101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.827147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.831082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.831145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.832619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.832685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.832930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.833064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.833107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.833157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.833214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.834936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.834991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.835038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.835075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.395 [2024-07-25 07:38:24.835327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.395 [2024-07-25 07:38:24.836864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.836913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.836952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.837942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.843205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.843260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.843298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.844018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.844276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.844410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.845257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.845302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.846497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.850134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.851372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.851416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.852849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.853182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.853317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.854231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.854278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.855463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.858793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.396 [2024-07-25 07:38:24.859749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.859793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.861293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.861544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.861681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.863164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.863220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.864550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.867200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.868399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.868442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.869669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.869916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.870047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.870645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.870690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.871891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.876233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.876879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.876923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.878194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.878488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.879813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.881249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.396 [2024-07-25 07:38:24.881292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.881879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.885975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.886810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.887923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.888510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.888769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.888906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.890108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.891318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.892747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.897312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.898361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.899228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.900060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.900318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.901621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.902859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.904295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.905228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.910475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.910894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.912184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.912644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.396 [2024-07-25 07:38:24.912894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.914438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.915931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.917284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.918283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.396 [2024-07-25 07:38:24.922424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.923842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.924208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.925685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.925963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.927187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.927781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.929127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.930575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.933260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.934679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.934738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.936129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.936462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.937677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.938970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.939015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.940451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.944087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.683 [2024-07-25 07:38:24.944920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.944975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.683 [2024-07-25 07:38:24.946438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.946728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.946867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.947626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.947671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.948436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.951664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.952289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.952336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.953445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.953696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.953835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.955360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.955412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.955889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.958480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.959788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.959835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.961094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.961486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.961630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.962532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.684 [2024-07-25 07:38:24.962578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.963763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.967332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.968230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.968278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.969076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.969330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.969470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.969961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.970003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.971065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.973997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.975458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.975516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.976455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.976763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.976903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.978171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.978215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.978572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.981815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.982404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.982447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.684 [2024-07-25 07:38:24.983943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.684 [2024-07-25 07:38:24.984392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:52.684 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats several hundred times between 07:38:24.984392 and 07:38:25.317338 ...]
00:32:52.953 [2024-07-25 07:38:25.317338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:52.953 [2024-07-25 07:38:25.317702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.318158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.319368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.319955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.321178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.322604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.322969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.323917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.325134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.325430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.326559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.327821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.329023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.330256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.332515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.333938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.333984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.334494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.334845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.336369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.337132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.337191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.337544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.338735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.953 [2024-07-25 07:38:25.340210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.340269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.340629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.340993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.341131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.341505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.341574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.341937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.343354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.343728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.343781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.344147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.344540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.344678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.345040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.345092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.345468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.347491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.347864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.347935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.348301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.348733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.348871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.349244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.953 [2024-07-25 07:38:25.349308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.349668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.351295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.351672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.351717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.352078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.352529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.352666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.353027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.353078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.353452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.354880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.355259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.355307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.355665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.356092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.356246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.356607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.356660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.357017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.358400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.358768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.358813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.359179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.953 [2024-07-25 07:38:25.359589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.359726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.360084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.360147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.360508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.361821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.362198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.362247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.362604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.953 [2024-07-25 07:38:25.362978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.363118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.363496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.363538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.363898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.365219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.365586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.365632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.365987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.366385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.366524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.366897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.366937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.367305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.368549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.954 [2024-07-25 07:38:25.368916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.368962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.369328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.369715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.369853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.370230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.370272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.370632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.372672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.373046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.373091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.373456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.373794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.373933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.374306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.374666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.374714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.376097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.376477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.376835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.376879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.377189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.378560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.379025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.954 [2024-07-25 07:38:25.379070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.380137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.382292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.383312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.383359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.384841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.385183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.386043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.386095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.386459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.387043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.391390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.391446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.391801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.392678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.392981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.393118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.394243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.395477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.395522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.396756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.397123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.398577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.398638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.954 [2024-07-25 07:38:25.398886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.399492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.400632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.400698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.402048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.404421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.405752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.405798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.406701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.406999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.408208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.408277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.408642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.409000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.411311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.411366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.412600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.413042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.413364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.413501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.414421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.415326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.415372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.416416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.954 [2024-07-25 07:38:25.417344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.418768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.418813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.419150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.419603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.954 [2024-07-25 07:38:25.419973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.420018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.421220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.422764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.423132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.423184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.424393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.424644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.426028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.426079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.427076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.428295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.429820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.429887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.430253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.430620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.430942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.431079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.431937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.955 [2024-07-25 07:38:25.433181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.433225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.434431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.434932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.435298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.435342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.435601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.436802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.438088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.438134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.439569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.442008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.443452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.443497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.443933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.444243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.445078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.446303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.447528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.448963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.451248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.452692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.453381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.453738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.955 [2024-07-25 07:38:25.454090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.454236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.455447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.456682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.456725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.457763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.458975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.460188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.460231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.460479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.461120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.461176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.461531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.461572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.464047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.464100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.465017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.465059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.465311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.466830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.466889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.468434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.468481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.470523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:52.955 [2024-07-25 07:38:25.470577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.471772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.471815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.472114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.473648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.473700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.474299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.474342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.476886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.476940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.955 [2024-07-25 07:38:25.477296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.956 [2024-07-25 07:38:25.477337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.956 [2024-07-25 07:38:25.477808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.956 [2024-07-25 07:38:25.479361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.956 [2024-07-25 07:38:25.479419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.956 [2024-07-25 07:38:25.480967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:52.956 [2024-07-25 07:38:25.481011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.482171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.482219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.483436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.483478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.483805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.483945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.483987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.218 [2024-07-25 07:38:25.485432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.485478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.487939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.487993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.488030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.488068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.488403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.489941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.489992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.490030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.490066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.491869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.218 [2024-07-25 07:38:25.493639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.493903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.495748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.496910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.496959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.497326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.497367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.497617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.497754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.497806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.218 [2024-07-25 07:38:25.499275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.219 [2024-07-25 07:38:25.499325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.219 [2024-07-25 07:38:25.501637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.219 [2024-07-25 07:38:25.501691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:53.219 [... the same *ERROR* line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources ("Failed to get src_mbufs!") repeats for every log entry between 2024-07-25 07:38:25.501691 and 07:38:25.811866 ...]
00:32:53.487 [2024-07-25 07:38:25.811866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:53.487 [2024-07-25 07:38:25.811920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.812285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.812332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.812582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.813888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.813939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.815145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.815187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.817818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.817873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.819308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.819355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.819605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.820057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.820109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.820491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.820536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.821689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.821736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.822958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.823001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.823287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.823424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.823467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.487 [2024-07-25 07:38:25.824689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.824744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.826180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.826242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.826283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.826321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.826663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.827953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.828005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.828043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.828080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.829792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.830843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.830894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.830932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.830986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.487 [2024-07-25 07:38:25.831432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.831576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.831620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.831659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.831696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.832749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.832803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.832841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.832878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.833135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.833281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.833324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.833365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.833402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.834469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.834516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.835973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.836029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.836281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.836419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.836463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.836834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.836886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.487 [2024-07-25 07:38:25.839297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.488 [2024-07-25 07:38:25.839350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.840566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.840607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.840921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.842224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.842275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.843498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.843540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.844975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.845030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.846420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.846470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.846720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.848045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.848096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.849456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.849501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.851933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.851989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.853266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.853307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.853713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.854180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.854235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.488 [2024-07-25 07:38:25.855658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.855707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.857555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.857610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.858784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.858827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.859113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.860446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.860499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.861551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.861603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.864386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.864439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.865796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.865840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.866093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.866973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.867024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.868227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.868268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.869647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.869701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.870071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.870116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.488 [2024-07-25 07:38:25.870371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.871862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.871926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.873470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.873513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.875801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.875853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.877076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.877120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.877373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.877829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.877889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.878254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.878297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.880547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.880601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.881359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.881401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.881698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.883023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.883075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.884299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.884340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.887401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.488 [2024-07-25 07:38:25.887461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.888971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.889012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.889265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.890777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.890845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.891948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.891990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.894223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.894276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.894628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.894668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.894973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.896260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.896312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.896353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.897563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.488 [2024-07-25 07:38:25.899792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.899845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.899887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.901092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.901386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.901524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.901566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.489 [2024-07-25 07:38:25.901996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.902040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.903150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.903209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.904716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.904759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.905047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.905194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.906072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.906116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.906159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.908100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.909613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.909664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.909712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.909977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.910571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.910622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.910661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.911563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.913806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.913861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.913899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.914785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.489 [2024-07-25 07:38:25.915079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.915225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.915272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.916692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.916746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.917869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.917920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.918814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.918860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.919214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.919350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.920295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.920350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.920388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.921587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.921956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.922000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.922038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.922344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.923653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.923704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.923746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.924655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.926162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.489 [2024-07-25 07:38:25.926217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.926270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.926623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.926872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.927010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.927067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.928540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.928601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.929726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.929795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.930168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.930215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.930569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.930707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.932105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.932162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.932200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.933372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.934360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.934407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.934445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.934750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.935206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.935257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.489 [2024-07-25 07:38:25.935297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.935769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.937221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.937277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.937320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.937678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.937929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.938062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.938105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.939326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.939368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.489 [2024-07-25 07:38:25.940555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.940610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.941937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.941980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.942230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.942368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.942413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.942455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.942509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.944001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.944049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.944089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.944127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.490 [2024-07-25 07:38:25.944416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.945111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.945174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.945215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.946650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.948494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.948548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.948586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.949786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.950091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.950236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.951463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.951507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.952277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.953359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.954815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.954882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.955245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.955616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.955754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.956118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.956169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.957577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.958715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.490 [2024-07-25 07:38:25.960081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.960126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.960486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.960859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.960994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.962333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.962377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.962735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.963916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.964293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.964341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.964698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.965036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.965177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.965544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.965588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.965959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.967575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.967946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.967992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.968362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.968705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.969164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.490 [2024-07-25 07:38:25.969532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.490 [2024-07-25 07:38:25.969591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.490 ... (the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! message repeats continuously over this interval) ... 
00:32:53.758 [2024-07-25 07:38:26.266879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.758 [2024-07-25 07:38:26.266918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.268126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.269575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.269644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.269694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.270053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.270322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.270458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.270509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.271995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.272051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.273245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.273294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.273655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.273699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.274072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.274218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.274262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.274304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.274354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.275772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.275832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.275874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.275912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.758 [2024-07-25 07:38:26.276224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.276679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.276736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.276789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.277152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.278754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.278808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.278847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.279211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.279526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.279663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.758 [2024-07-25 07:38:26.280026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.280080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.280766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.281937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.282321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.282380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.282740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.283070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.283216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.283998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.759 [2024-07-25 07:38:26.284045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.285328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.286680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.020 [2024-07-25 07:38:26.287733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.287781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.288260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.288511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.288648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.289029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.289070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.289441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.290708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.020 [2024-07-25 07:38:26.292191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.292235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.292592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.292947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.293081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.293459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.293506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.294767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.296447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.296819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.296865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.298166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.298536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.299752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.300116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.021 [2024-07-25 07:38:26.300174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.300530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.302076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.303061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.303426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.303787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.304165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.304305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.305412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.306239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.307111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.309722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.310390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.311421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.311781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.312114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.312572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.313902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.314525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.315606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.318528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.318944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.320237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.320595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.021 [2024-07-25 07:38:26.320930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.321392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.321775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.323282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.323641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.326205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.326577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.326936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.327302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.327677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.328947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.329418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.329997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.331109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.333525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.333938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.333984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.335429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.335759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.336622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.338185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.338229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.339557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.340961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.021 [2024-07-25 07:38:26.341913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.341960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.343157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.343410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.343546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.344464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.344522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.345209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.346794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.348014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.348062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.348650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.348901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.349037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.350483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.350531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.350887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.352085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.353621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.353673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.354959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.355302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.355440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.355803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.021 [2024-07-25 07:38:26.355848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.356354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.357559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.358703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.021 [2024-07-25 07:38:26.358751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.359948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.360343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.360479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.360839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.360897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.362333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.363566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.364899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.364948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.365311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.365631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.365767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.366945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.366992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.367913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.369012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.370228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.370275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.371747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.022 [2024-07-25 07:38:26.372127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.372268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.372632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.372672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.373629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.374835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.375209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.375269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.375621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.375937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.376075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.377342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.377388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.378922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.380077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.381394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.381437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.381791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.382233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.382371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.382738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.382782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.383151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.384717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.022 [2024-07-25 07:38:26.386160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.386204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.386585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.386858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.386994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.388553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.388604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.388959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.390225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.391488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.391532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.391889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.392346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.392483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.392847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.393953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.393998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.395398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.395765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.396471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.396516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.396806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.398328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.398693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.022 [2024-07-25 07:38:26.398755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.399111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.401825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.403345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.403389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.403743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.404193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.405730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.405788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.407336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.408779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.411113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.411174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.412462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.413564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.414003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.414156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.414520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.415891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.415939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.022 [2024-07-25 07:38:26.417059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.418085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.419296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.419338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.023 [2024-07-25 07:38:26.419631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.420950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.421849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.421896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.422255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.424774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.426239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.426283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.427602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.427899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.429225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.429276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.430495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.431050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.433541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.433594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.434818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.436062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.436317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.436453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.437802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.439068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.439111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.440326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.023 [2024-07-25 07:38:26.441444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.442647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.442689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.443002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.444321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.445106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.445157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.446359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.447770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.448136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.448187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.449504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.449790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.451116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.451172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.452385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.453208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.455701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.455774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.456132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.456498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.456748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.456886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.458297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.023 [2024-07-25 07:38:26.459593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.459635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.460804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.462032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.463257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.463300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.463550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.464001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.464420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.464466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.465653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.467599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.468811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.468855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.470076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.470362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.471441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.471808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.472350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.473550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.475457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.476672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.477873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.023 [2024-07-25 07:38:26.479080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.023 [2024-07-25 07:38:26.479334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:54.023 [2024-07-25 07:38:26.479481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously between 07:38:26.479481 and 07:38:26.706721; the intervening duplicate entries are omitted here ...]
00:32:54.289 [2024-07-25 07:38:26.706721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
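The block above is the dpdk_cryptodev accel module repeatedly reporting that it could not obtain the source mbufs it needs for a task under this verify workload (queue depth 128, 64 KiB I/Os). The snippet below is only a minimal, hypothetical sketch of what such a check looks like when written against the public DPDK API (rte_pktmbuf_alloc_bulk / rte_pktmbuf_free_bulk); the helper name, the pool pointer and the plain-stderr error handling are invented for illustration and are not the actual SPDK accel_dpdk_cryptodev implementation.

/*
 * Hypothetical illustration only -- NOT the SPDK accel_dpdk_cryptodev code.
 * It shows the general shape of a bulk mbuf allocation with DPDK: if
 * rte_pktmbuf_alloc_bulk() cannot satisfy the request, nothing is allocated
 * and the caller backs off with -ENOMEM.
 */
#include <errno.h>
#include <stdint.h>
#include <stdio.h>
#include <rte_mbuf.h>
#include <rte_mempool.h>

/* mbuf_pool is assumed to have been created elsewhere with rte_pktmbuf_pool_create() */
static int
task_alloc_mbufs_sketch(struct rte_mempool *mbuf_pool, uint32_t num_ops,
                        struct rte_mbuf **src_mbufs, struct rte_mbuf **dst_mbufs)
{
        if (rte_pktmbuf_alloc_bulk(mbuf_pool, src_mbufs, num_ops) != 0) {
                fprintf(stderr, "Failed to get src_mbufs!\n");
                return -ENOMEM;
        }
        if (rte_pktmbuf_alloc_bulk(mbuf_pool, dst_mbufs, num_ops) != 0) {
                /* give back the source mbufs so the pool can recover */
                rte_pktmbuf_free_bulk(src_mbufs, num_ops);
                fprintf(stderr, "Failed to get dst_mbufs!\n");
                return -ENOMEM;
        }
        return 0;
}

The second allocation in this sketch corresponds to the "Failed to get dst_mbufs!" variant at accel_dpdk_cryptodev.c:476, which is exactly what follows next in the log once the source allocations start succeeding again.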
00:32:54.289 [2024-07-25 07:38:26.708873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:54.289 [2024-07-25 07:38:26.708931] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:54.289 [2024-07-25 07:38:26.709298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:54.289 [2024-07-25 07:38:26.709718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:54.289 [2024-07-25 07:38:26.710751] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:54.289 [2024-07-25 07:38:26.710801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:54.289 [2024-07-25 07:38:26.712030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:54.549
00:32:54.549 Latency(us)
00:32:54.549 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:54.549 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x0 length 0x100
00:32:54.549 crypto_ram : 5.69 44.98 2.81 0.00 0.00 2751673.14 72980.89 2389075.56
00:32:54.549 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x100 length 0x100
00:32:54.549 crypto_ram : 5.68 47.04 2.94 0.00 0.00 2636920.24 13841.20 2321966.69
00:32:54.549 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x0 length 0x100
00:32:54.549 crypto_ram2 : 5.69 45.48 2.84 0.00 0.00 2629010.85 4902.09 2389075.56
00:32:54.549 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x100 length 0x100
00:32:54.549 crypto_ram2 : 5.68 47.72 2.98 0.00 0.00 2518273.18 13264.49 2254857.83
00:32:54.549 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x0 length 0x100
00:32:54.549 crypto_ram3 : 5.55 313.00 19.56 0.00 0.00 368288.58 35651.58 523449.14
00:32:54.549 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x100 length 0x100
00:32:54.549 crypto_ram3 : 5.52 326.35 20.40 0.00 0.00 353662.29 19818.09 526804.58
00:32:54.549 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x0 length 0x100
00:32:54.549 crypto_ram4 : 5.63 328.67 20.54 0.00 0.00 341340.98 5452.60 459695.72
00:32:54.549 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:54.549 Verification LBA range: start 0x100 length 0x100
00:32:54.549 crypto_ram4 : 5.61 343.03 21.44 0.00 0.00 328128.58 3827.30 452984.83
00:32:54.549 ===================================================================================================================
00:32:54.549 Total : 1496.26 93.52 0.00 0.00 634874.57 3827.30 2389075.56
00:32:55.117
00:32:55.117 real 0m8.767s
00:32:55.117 user 0m16.679s
00:32:55.117 sys 0m0.389s
00:32:55.117 07:38:27 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable
00:32:55.117 07:38:27 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:32:55.117 ************************************
00:32:55.117 END TEST bdev_verify_big_io
00:32:55.117 ************************************
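As a quick sanity check on the table above, the MiB/s column is just the reported IOPS scaled by the 65536-byte I/O size of this run. The small program below is purely illustrative (it is not part of the test suite); the sample rows are taken from the table.

/* Illustrative check: MiB/s = IOPS * io_size_bytes / (1024 * 1024). */
#include <stdio.h>

int main(void)
{
        const double io_size = 65536.0;                      /* bytes per I/O in this run */
        const double iops[]  = { 44.98, 313.00, 1496.26 };   /* sample rows from the table */
        const char  *name[]  = { "crypto_ram (0x1)", "crypto_ram3 (0x1)", "Total" };

        for (int i = 0; i < 3; i++) {
                printf("%-18s %8.2f IOPS -> %6.2f MiB/s\n",
                       name[i], iops[i], iops[i] * io_size / (1024.0 * 1024.0));
        }
        return 0;   /* prints ~2.81, ~19.56 and ~93.52 MiB/s, matching the table */
}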
common/autotest_common.sh@1107 -- # xtrace_disable 00:32:55.117 07:38:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:55.117 ************************************ 00:32:55.117 START TEST bdev_write_zeroes 00:32:55.117 ************************************ 00:32:55.117 07:38:27 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:55.117 [2024-07-25 07:38:27.497115] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:32:55.117 [2024-07-25 07:38:27.497185] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1815643 ] 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:55.117 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:55.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:55.117 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:55.117 [2024-07-25 07:38:27.625942] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:55.377 [2024-07-25 07:38:27.709723] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:32:55.377 [2024-07-25 07:38:27.730964] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:55.377 [2024-07-25 07:38:27.738984] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:55.377 [2024-07-25 07:38:27.747003] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:55.377 [2024-07-25 07:38:27.857104] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:57.913 [2024-07-25 07:38:30.033269] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:57.913 [2024-07-25 07:38:30.033333] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:57.913 [2024-07-25 07:38:30.033348] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:57.913 [2024-07-25 07:38:30.041289] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:57.913 [2024-07-25 07:38:30.041307] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:57.913 [2024-07-25 07:38:30.041318] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:32:57.913 [2024-07-25 07:38:30.049308] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:57.913 [2024-07-25 07:38:30.049326] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:57.913 [2024-07-25 07:38:30.049337] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:57.913 [2024-07-25 07:38:30.057329] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:57.913 [2024-07-25 07:38:30.057346] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:57.913 [2024-07-25 07:38:30.057356] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:57.913 Running I/O for 1 seconds... 00:32:58.851 00:32:58.851 Latency(us) 00:32:58.851 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:58.851 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:58.851 crypto_ram : 1.02 2126.77 8.31 0.00 0.00 59765.05 5006.95 71722.60 00:32:58.851 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:58.851 crypto_ram2 : 1.02 2132.52 8.33 0.00 0.00 59295.18 4980.74 66689.43 00:32:58.851 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:58.851 crypto_ram3 : 1.02 16345.03 63.85 0.00 0.00 7715.82 2293.76 9961.47 00:32:58.851 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:58.851 crypto_ram4 : 1.02 16382.42 63.99 0.00 0.00 7676.35 2280.65 8074.04 00:32:58.851 =================================================================================================================== 00:32:58.851 Total : 36986.74 144.48 0.00 0.00 13689.76 2280.65 71722.60 00:32:59.111 00:32:59.111 real 0m4.065s 00:32:59.111 user 0m3.696s 00:32:59.111 sys 0m0.326s 00:32:59.111 07:38:31 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:59.111 07:38:31 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:32:59.111 ************************************ 00:32:59.111 END TEST bdev_write_zeroes 00:32:59.111 ************************************ 00:32:59.111 07:38:31 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.111 07:38:31 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:32:59.111 07:38:31 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:59.111 07:38:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:59.111 ************************************ 00:32:59.111 START TEST bdev_json_nonenclosed 00:32:59.111 ************************************ 00:32:59.111 07:38:31 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.111 [2024-07-25 07:38:31.640610] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:32:59.111 [2024-07-25 07:38:31.640665] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1816211 ] 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:59.371 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:59.371 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.371 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:59.371 [2024-07-25 07:38:31.772476] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:59.371 [2024-07-25 07:38:31.855147] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:32:59.371 [2024-07-25 07:38:31.855209] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:32:59.371 [2024-07-25 07:38:31.855224] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:59.371 [2024-07-25 07:38:31.855235] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:59.631 00:32:59.631 real 0m0.359s 00:32:59.631 user 0m0.216s 00:32:59.631 sys 0m0.141s 00:32:59.631 07:38:31 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:59.631 07:38:31 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:32:59.631 ************************************ 00:32:59.631 END TEST bdev_json_nonenclosed 00:32:59.631 ************************************ 00:32:59.631 07:38:31 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.632 07:38:31 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:32:59.632 07:38:31 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:59.632 07:38:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:59.632 ************************************ 00:32:59.632 START TEST bdev_json_nonarray 00:32:59.632 ************************************ 00:32:59.632 07:38:32 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.632 [2024-07-25 07:38:32.075749] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:32:59.632 [2024-07-25 07:38:32.075802] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1816407 ] 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:59.632 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:59.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:59.632 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:59.903 [2024-07-25 07:38:32.207657] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:59.903 [2024-07-25 07:38:32.289349] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:32:59.903 [2024-07-25 07:38:32.289420] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:32:59.903 [2024-07-25 07:38:32.289436] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:59.903 [2024-07-25 07:38:32.289448] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:59.903 00:32:59.903 real 0m0.358s 00:32:59.903 user 0m0.214s 00:32:59.903 sys 0m0.142s 00:32:59.903 07:38:32 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:59.903 07:38:32 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:32:59.903 ************************************ 00:32:59.903 END TEST bdev_json_nonarray 00:32:59.903 ************************************ 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]] 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]] 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]] 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:32:59.903 07:38:32 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:32:59.903 00:32:59.903 real 1m10.330s 00:32:59.903 user 2m52.886s 
00:32:59.903 sys 0m8.228s 00:32:59.903 07:38:32 blockdev_crypto_aesni -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:59.903 07:38:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:59.903 ************************************ 00:32:59.903 END TEST blockdev_crypto_aesni 00:32:59.903 ************************************ 00:33:00.182 07:38:32 -- spdk/autotest.sh@362 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:00.182 07:38:32 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:33:00.182 07:38:32 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:00.182 07:38:32 -- common/autotest_common.sh@10 -- # set +x 00:33:00.182 ************************************ 00:33:00.182 START TEST blockdev_crypto_sw 00:33:00.182 ************************************ 00:33:00.182 07:38:32 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:00.182 * Looking for test storage... 00:33:00.182 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:33:00.182 07:38:32 
blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1816541 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:00.182 07:38:32 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1816541 00:33:00.182 07:38:32 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # '[' -z 1816541 ']' 00:33:00.182 07:38:32 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:00.182 07:38:32 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:00.182 07:38:32 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:00.183 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:00.183 07:38:32 blockdev_crypto_sw -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:00.183 07:38:32 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:00.183 [2024-07-25 07:38:32.683645] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:33:00.183 [2024-07-25 07:38:32.683712] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1816541 ] 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.4 cannot be used 
00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.442 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:00.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.443 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:00.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.443 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:00.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.443 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:00.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.443 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:00.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.443 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:00.443 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:00.443 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:00.443 [2024-07-25 07:38:32.816984] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:00.443 [2024-07-25 07:38:32.902849] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@864 -- # return 0 00:33:01.381 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:33:01.381 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:33:01.381 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:01.381 Malloc0 00:33:01.381 Malloc1 
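Editor's note: the two Malloc bdevs created above are the base devices that setup_crypto_sw_conf layers the software-crypto vbdevs on; the "Found key test_dek_sw*" notices just below are the vbdev_crypto module picking up keys that were registered with the accel framework. A rough, hand-written equivalent of what the harness drives over the RPC socket is sketched here. The key material and the AES_XTS cipher are placeholders (not the values the test scripts actually use), rpc.py flags can differ between SPDK releases, and the crypto_ram2-on-Malloc1 stacking is inferred from Malloc1 being claimed in the bdev dump further down.

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

  # 16 MiB base bdevs, matching the 512 B x 32768 and 4 KiB x 4096 geometries dumped later in this run
  $RPC bdev_malloc_create -b Malloc0 16 512
  $RPC bdev_malloc_create -b Malloc1 16 4096

  # register software crypto keys with the accel framework (placeholder key values)
  $RPC accel_crypto_key_create -c AES_XTS -k 00112233445566778899001122334455 -e 11223344556677889900112233445500 -n test_dek_sw
  $RPC accel_crypto_key_create -c AES_XTS -k 11223344556677889900112233445566 -e 22334455667788990011223344556611 -n test_dek_sw2
  $RPC accel_crypto_key_create -c AES_XTS -k 22334455667788990011223344556677 -e 33445566778899001122334455667722 -n test_dek_sw3

  # stack the crypto vbdevs: crypto_ram on Malloc0, crypto_ram2 on Malloc1, crypto_ram3 on top of crypto_ram2
  $RPC bdev_crypto_create Malloc0 crypto_ram -n test_dek_sw
  $RPC bdev_crypto_create Malloc1 crypto_ram2 -n test_dek_sw2
  $RPC bdev_crypto_create crypto_ram2 crypto_ram3 -n test_dek_sw3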
00:33:01.381 true 00:33:01.381 true 00:33:01.381 true 00:33:01.381 [2024-07-25 07:38:33.837390] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:01.381 crypto_ram 00:33:01.381 [2024-07-25 07:38:33.845419] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:01.381 crypto_ram2 00:33:01.381 [2024-07-25 07:38:33.853443] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:01.381 crypto_ram3 00:33:01.381 [ 00:33:01.381 { 00:33:01.381 "name": "Malloc1", 00:33:01.381 "aliases": [ 00:33:01.381 "1cc491cf-67da-4758-8b36-b5f7605eed9c" 00:33:01.381 ], 00:33:01.381 "product_name": "Malloc disk", 00:33:01.381 "block_size": 4096, 00:33:01.381 "num_blocks": 4096, 00:33:01.381 "uuid": "1cc491cf-67da-4758-8b36-b5f7605eed9c", 00:33:01.381 "assigned_rate_limits": { 00:33:01.381 "rw_ios_per_sec": 0, 00:33:01.381 "rw_mbytes_per_sec": 0, 00:33:01.381 "r_mbytes_per_sec": 0, 00:33:01.381 "w_mbytes_per_sec": 0 00:33:01.381 }, 00:33:01.381 "claimed": true, 00:33:01.381 "claim_type": "exclusive_write", 00:33:01.381 "zoned": false, 00:33:01.381 "supported_io_types": { 00:33:01.381 "read": true, 00:33:01.381 "write": true, 00:33:01.381 "unmap": true, 00:33:01.381 "flush": true, 00:33:01.381 "reset": true, 00:33:01.381 "nvme_admin": false, 00:33:01.381 "nvme_io": false, 00:33:01.381 "nvme_io_md": false, 00:33:01.381 "write_zeroes": true, 00:33:01.381 "zcopy": true, 00:33:01.381 "get_zone_info": false, 00:33:01.381 "zone_management": false, 00:33:01.381 "zone_append": false, 00:33:01.381 "compare": false, 00:33:01.381 "compare_and_write": false, 00:33:01.381 "abort": true, 00:33:01.381 "seek_hole": false, 00:33:01.381 "seek_data": false, 00:33:01.381 "copy": true, 00:33:01.381 "nvme_iov_md": false 00:33:01.381 }, 00:33:01.381 "memory_domains": [ 00:33:01.381 { 00:33:01.381 "dma_device_id": "system", 00:33:01.381 "dma_device_type": 1 00:33:01.381 }, 00:33:01.381 { 00:33:01.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:01.381 "dma_device_type": 2 00:33:01.381 } 00:33:01.381 ], 00:33:01.381 "driver_specific": {} 00:33:01.381 } 00:33:01.381 ] 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:01.381 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:01.381 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:33:01.381 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:01.381 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:01.641 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:01.641 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd 
save_subsystem_config -n iobuf 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:01.641 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:33:01.641 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:01.641 07:38:33 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:33:01.641 07:38:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:01.641 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:33:01.641 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name 00:33:01.641 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0411a913-7f78-53e6-850b-ebc1977ed2c1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "0411a913-7f78-53e6-850b-ebc1977ed2c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "48beea06-426c-5b16-8d9b-4d8fbfe63564"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "48beea06-426c-5b16-8d9b-4d8fbfe63564",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:01.641 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:33:01.641 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # 
hello_world_bdev=crypto_ram 00:33:01.641 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:33:01.641 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 1816541 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # '[' -z 1816541 ']' 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # kill -0 1816541 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # uname 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1816541 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1816541' 00:33:01.641 killing process with pid 1816541 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@969 -- # kill 1816541 00:33:01.641 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@974 -- # wait 1816541 00:33:02.211 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:02.211 07:38:34 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:02.211 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:33:02.211 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:02.211 07:38:34 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:02.211 ************************************ 00:33:02.211 START TEST bdev_hello_world 00:33:02.211 ************************************ 00:33:02.211 07:38:34 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:02.211 [2024-07-25 07:38:34.561598] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:33:02.211 [2024-07-25 07:38:34.561656] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1816827 ] 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.6 cannot be used 
00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:02.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.211 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:02.211 [2024-07-25 07:38:34.691817] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:02.471 [2024-07-25 07:38:34.773313] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:02.471 [2024-07-25 07:38:34.947993] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:02.471 [2024-07-25 07:38:34.948059] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:02.471 [2024-07-25 07:38:34.948073] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:02.471 [2024-07-25 07:38:34.956011] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:02.471 [2024-07-25 07:38:34.956028] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:02.471 [2024-07-25 07:38:34.956039] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:02.471 [2024-07-25 07:38:34.964032] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:02.471 [2024-07-25 07:38:34.964048] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:02.471 [2024-07-25 07:38:34.964059] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:02.471 [2024-07-25 07:38:35.003914] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:02.471 [2024-07-25 07:38:35.003946] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:02.471 [2024-07-25 07:38:35.003963] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:33:02.730 [2024-07-25 07:38:35.005109] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:02.730 [2024-07-25 07:38:35.005180] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:02.730 [2024-07-25 07:38:35.005195] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:02.730 [2024-07-25 07:38:35.005226] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
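Editor's note: the hello_bdev pass above opens the crypto_ram vbdev, writes a buffer, reads it back and recovers "Hello World!", exercising the full encrypt-on-write / decrypt-on-read path through the stacked software crypto bdev. The run can be repeated by hand with the same binary and the JSON config generated for this run, using the paths exactly as they appear in this log:

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram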
00:33:02.730 00:33:02.730 [2024-07-25 07:38:35.005243] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:02.730 00:33:02.730 real 0m0.683s 00:33:02.730 user 0m0.456s 00:33:02.730 sys 0m0.207s 00:33:02.730 07:38:35 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:02.730 07:38:35 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:02.730 ************************************ 00:33:02.730 END TEST bdev_hello_world 00:33:02.730 ************************************ 00:33:02.730 07:38:35 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:33:02.730 07:38:35 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:33:02.730 07:38:35 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:02.730 07:38:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:02.990 ************************************ 00:33:02.990 START TEST bdev_bounds 00:33:02.990 ************************************ 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1817028 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1817028' 00:33:02.990 Process bdevio pid: 1817028 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1817028 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1817028 ']' 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:02.990 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:02.990 07:38:35 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:02.990 [2024-07-25 07:38:35.333310] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:33:02.990 [2024-07-25 07:38:35.333366] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1817028 ] 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.990 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:02.990 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.6 cannot be used 
00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:02.991 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:02.991 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:02.991 [2024-07-25 07:38:35.462879] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:03.250 [2024-07-25 07:38:35.552029] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:33:03.250 [2024-07-25 07:38:35.552049] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:33:03.250 [2024-07-25 07:38:35.552053] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:03.250 [2024-07-25 07:38:35.721704] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:03.250 [2024-07-25 07:38:35.721758] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:03.250 [2024-07-25 07:38:35.721772] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.250 [2024-07-25 07:38:35.729726] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:03.250 [2024-07-25 07:38:35.729742] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:03.250 [2024-07-25 07:38:35.729753] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.250 [2024-07-25 07:38:35.737749] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:03.250 [2024-07-25 07:38:35.737764] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:03.250 [2024-07-25 07:38:35.737774] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.819 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:03.819 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:33:03.819 07:38:36 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:03.819 I/O targets: 00:33:03.819 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:33:03.819 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:33:03.819 00:33:03.819 00:33:03.819 CUnit - A unit testing framework for C - Version 2.1-3 00:33:03.819 http://cunit.sourceforge.net/ 00:33:03.819 
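The "Found key" notices and the deferred "vbdev creation" messages above come from the crypto bdev definitions inside the bdev.json handed to bdevio: crypto_ram is layered on a Malloc0 base bdev and bound to the test_dek_sw DEK, and the listed I/O target size (32768 blocks of 512 bytes) matches that malloc backing. A rough sketch of what such a configuration section could look like, written as a shell heredoc; the method names appear in this log, but the parameter names are assumptions for illustration and can differ between SPDK releases (recent releases also register the DEK separately as an accel crypto key, which is omitted here):

  # Illustrative config only: parameter names are assumed, not copied from this run.
  cat > bdev.json <<'EOF'
  {
    "subsystems": [
      {
        "subsystem": "bdev",
        "config": [
          { "method": "bdev_malloc_create",
            "params": { "name": "Malloc0", "num_blocks": 32768, "block_size": 512 } },
          { "method": "bdev_crypto_create",
            "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram",
                        "key_name": "test_dek_sw" } }
        ]
      }
    ]
  }
  EOF

The CUnit suites printed next run against exactly these vbdevs (crypto_ram and crypto_ram3).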
00:33:03.819 00:33:03.819 Suite: bdevio tests on: crypto_ram3 00:33:03.819 Test: blockdev write read block ...passed 00:33:03.819 Test: blockdev write zeroes read block ...passed 00:33:03.819 Test: blockdev write zeroes read no split ...passed 00:33:03.819 Test: blockdev write zeroes read split ...passed 00:33:03.819 Test: blockdev write zeroes read split partial ...passed 00:33:03.819 Test: blockdev reset ...passed 00:33:03.819 Test: blockdev write read 8 blocks ...passed 00:33:03.819 Test: blockdev write read size > 128k ...passed 00:33:03.819 Test: blockdev write read invalid size ...passed 00:33:03.819 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:03.819 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:03.819 Test: blockdev write read max offset ...passed 00:33:03.819 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:03.819 Test: blockdev writev readv 8 blocks ...passed 00:33:03.819 Test: blockdev writev readv 30 x 1block ...passed 00:33:03.819 Test: blockdev writev readv block ...passed 00:33:03.819 Test: blockdev writev readv size > 128k ...passed 00:33:03.819 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:03.819 Test: blockdev comparev and writev ...passed 00:33:03.819 Test: blockdev nvme passthru rw ...passed 00:33:03.819 Test: blockdev nvme passthru vendor specific ...passed 00:33:03.819 Test: blockdev nvme admin passthru ...passed 00:33:03.819 Test: blockdev copy ...passed 00:33:03.819 Suite: bdevio tests on: crypto_ram 00:33:03.819 Test: blockdev write read block ...passed 00:33:03.819 Test: blockdev write zeroes read block ...passed 00:33:03.819 Test: blockdev write zeroes read no split ...passed 00:33:04.079 Test: blockdev write zeroes read split ...passed 00:33:04.079 Test: blockdev write zeroes read split partial ...passed 00:33:04.079 Test: blockdev reset ...passed 00:33:04.079 Test: blockdev write read 8 blocks ...passed 00:33:04.079 Test: blockdev write read size > 128k ...passed 00:33:04.079 Test: blockdev write read invalid size ...passed 00:33:04.079 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:04.079 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:04.079 Test: blockdev write read max offset ...passed 00:33:04.079 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:04.079 Test: blockdev writev readv 8 blocks ...passed 00:33:04.079 Test: blockdev writev readv 30 x 1block ...passed 00:33:04.079 Test: blockdev writev readv block ...passed 00:33:04.079 Test: blockdev writev readv size > 128k ...passed 00:33:04.079 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:04.079 Test: blockdev comparev and writev ...passed 00:33:04.079 Test: blockdev nvme passthru rw ...passed 00:33:04.079 Test: blockdev nvme passthru vendor specific ...passed 00:33:04.079 Test: blockdev nvme admin passthru ...passed 00:33:04.079 Test: blockdev copy ...passed 00:33:04.079 00:33:04.079 Run Summary: Type Total Ran Passed Failed Inactive 00:33:04.079 suites 2 2 n/a 0 0 00:33:04.079 tests 46 46 46 0 0 00:33:04.079 asserts 260 260 260 0 n/a 00:33:04.079 00:33:04.079 Elapsed time = 0.078 seconds 00:33:04.079 0 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1817028 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1817028 ']' 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@954 -- # kill -0 1817028 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1817028 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1817028' 00:33:04.079 killing process with pid 1817028 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1817028 00:33:04.079 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1817028 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:33:04.339 00:33:04.339 real 0m1.364s 00:33:04.339 user 0m3.520s 00:33:04.339 sys 0m0.390s 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:04.339 ************************************ 00:33:04.339 END TEST bdev_bounds 00:33:04.339 ************************************ 00:33:04.339 07:38:36 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:04.339 07:38:36 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:33:04.339 07:38:36 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:04.339 07:38:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:04.339 ************************************ 00:33:04.339 START TEST bdev_nbd 00:33:04.339 ************************************ 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:04.339 07:38:36 
blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1817258 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1817258 /var/tmp/spdk-nbd.sock 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1817258 ']' 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:04.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:04.339 07:38:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:04.339 [2024-07-25 07:38:36.793631] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
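The bdev_nbd stage that begins here follows the pattern traced by the nbd_function_test helper below: with bdev_svc listening on /var/tmp/spdk-nbd.sock, each bdev is exported as a kernel NBD device over RPC, its presence is checked in /proc/partitions, a small direct-I/O read is issued, random data is written and compared back, and the device is detached again. A condensed sketch of that sequence for a single bdev, assuming an SPDK checkout as the working directory and /tmp for scratch files (the log keeps them under the Jenkins workspace):

  sock=/var/tmp/spdk-nbd.sock

  # Export the crypto bdev as /dev/nbd0 via the bdev_svc RPC socket.
  ./scripts/rpc.py -s "$sock" nbd_start_disk crypto_ram /dev/nbd0

  # Wait for the kernel to publish the device, then do a 4 KiB direct read.
  until grep -q -w nbd0 /proc/partitions; do sleep 0.1; done
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct

  # Write 1 MiB of random data through the NBD device and compare it back.
  dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256
  dd if=/tmp/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
  cmp -b -n 1M /tmp/nbdrandtest /dev/nbd0

  # List the exported devices, then detach the bdev again.
  ./scripts/rpc.py -s "$sock" nbd_get_disks
  ./scripts/rpc.py -s "$sock" nbd_stop_disk /dev/nbd0

The same helper later repeats this pattern for an lvol bdev (bdev_malloc_create, bdev_lvol_create_lvstore, bdev_lvol_create, then mkfs.ext4 on the exported /dev/nbd0) before the bdev_fio stage takes over with the spdk_bdev fio ioengine.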
00:33:04.339 [2024-07-25 07:38:36.793689] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:04.339 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:04.339 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.339 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:04.598 [2024-07-25 07:38:36.928075] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:04.598 [2024-07-25 07:38:37.010685] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:04.857 [2024-07-25 07:38:37.177655] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:04.857 [2024-07-25 07:38:37.177720] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:04.857 [2024-07-25 07:38:37.177734] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:04.857 [2024-07-25 07:38:37.185674] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:04.857 [2024-07-25 07:38:37.185691] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:04.857 [2024-07-25 07:38:37.185702] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:04.857 [2024-07-25 07:38:37.193693] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:04.857 [2024-07-25 07:38:37.193713] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:04.857 [2024-07-25 07:38:37.193724] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:05.424 1+0 records in 00:33:05.424 1+0 records out 00:33:05.424 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268505 s, 15.3 MB/s 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:05.424 07:38:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:05.683 07:38:38 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:05.683 1+0 records in 00:33:05.683 1+0 records out 00:33:05.683 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314208 s, 13.0 MB/s 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:05.683 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:05.941 { 00:33:05.941 "nbd_device": "/dev/nbd0", 00:33:05.941 "bdev_name": "crypto_ram" 00:33:05.941 }, 00:33:05.941 { 00:33:05.941 "nbd_device": "/dev/nbd1", 00:33:05.941 "bdev_name": "crypto_ram3" 00:33:05.941 } 00:33:05.941 ]' 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:05.941 { 00:33:05.941 "nbd_device": "/dev/nbd0", 00:33:05.941 "bdev_name": "crypto_ram" 00:33:05.941 }, 00:33:05.941 { 00:33:05.941 "nbd_device": "/dev/nbd1", 00:33:05.941 "bdev_name": "crypto_ram3" 00:33:05.941 } 00:33:05.941 ]' 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:05.941 07:38:38 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:05.941 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:06.200 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:06.458 07:38:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 
-- # echo 0 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:06.717 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:06.976 /dev/nbd0 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:06.976 1+0 records in 00:33:06.976 1+0 records out 00:33:06.976 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261246 s, 
15.7 MB/s 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:06.976 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:33:07.235 /dev/nbd1 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:07.235 1+0 records in 00:33:07.235 1+0 records out 00:33:07.235 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000344117 s, 11.9 MB/s 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:07.235 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:07.494 { 00:33:07.494 "nbd_device": "/dev/nbd0", 00:33:07.494 "bdev_name": "crypto_ram" 00:33:07.494 }, 00:33:07.494 { 00:33:07.494 "nbd_device": "/dev/nbd1", 00:33:07.494 "bdev_name": "crypto_ram3" 00:33:07.494 } 00:33:07.494 ]' 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:07.494 { 00:33:07.494 "nbd_device": "/dev/nbd0", 00:33:07.494 "bdev_name": "crypto_ram" 00:33:07.494 }, 00:33:07.494 { 00:33:07.494 "nbd_device": "/dev/nbd1", 00:33:07.494 "bdev_name": "crypto_ram3" 00:33:07.494 } 00:33:07.494 ]' 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:07.494 /dev/nbd1' 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:07.494 /dev/nbd1' 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:07.494 256+0 records in 00:33:07.494 256+0 records out 00:33:07.494 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114498 s, 91.6 MB/s 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:07.494 256+0 records in 00:33:07.494 256+0 records out 00:33:07.494 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0183116 s, 57.3 MB/s 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:07.494 07:38:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:07.494 256+0 records in 00:33:07.494 256+0 records out 00:33:07.494 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0425989 s, 24.6 MB/s 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:07.494 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:07.753 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:07.753 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:07.753 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:07.754 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:08.013 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:08.271 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:08.530 07:38:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:08.530 malloc_lvol_verify 00:33:08.530 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:08.789 7b7c8541-77bf-4930-ba26-57ebcd576223 00:33:08.789 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:09.047 6d4dcb13-8313-4498-965c-71a3a03a889c 00:33:09.047 07:38:41 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:09.306 /dev/nbd0 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:09.307 mke2fs 1.46.5 (30-Dec-2021) 00:33:09.307 Discarding device blocks: 0/4096 done 00:33:09.307 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:09.307 00:33:09.307 Allocating group tables: 0/1 done 00:33:09.307 Writing inode tables: 0/1 done 00:33:09.307 Creating journal (1024 blocks): done 00:33:09.307 Writing superblocks and filesystem accounting information: 0/1 done 00:33:09.307 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:09.307 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1817258 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1817258 ']' 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1817258 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:09.566 07:38:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1817258 00:33:09.566 07:38:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:33:09.566 07:38:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:33:09.566 07:38:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 
1817258' 00:33:09.566 killing process with pid 1817258 00:33:09.566 07:38:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1817258 00:33:09.566 07:38:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1817258 00:33:09.825 07:38:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:33:09.825 00:33:09.825 real 0m5.485s 00:33:09.825 user 0m7.766s 00:33:09.825 sys 0m2.206s 00:33:09.825 07:38:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:09.825 07:38:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:09.825 ************************************ 00:33:09.825 END TEST bdev_nbd 00:33:09.825 ************************************ 00:33:09.826 07:38:42 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:33:09.826 07:38:42 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:33:09.826 07:38:42 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:33:09.826 07:38:42 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:33:09.826 07:38:42 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:33:09.826 07:38:42 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:09.826 07:38:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:09.826 ************************************ 00:33:09.826 START TEST bdev_fio 00:33:09.826 ************************************ 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:09.826 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:09.826 07:38:42 
blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:09.826 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:10.085 ************************************ 00:33:10.085 START TEST bdev_fio_rw_verify 00:33:10.085 ************************************ 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:10.085 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:10.086 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:10.086 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:10.086 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:10.086 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:10.086 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:10.086 07:38:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:10.344 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:10.344 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 
4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:10.344 fio-3.35 00:33:10.344 Starting 2 threads 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:10.603 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.603 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:10.604 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:10.604 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:22.881 00:33:22.881 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1818590: Thu Jul 25 07:38:53 2024 00:33:22.881 read: IOPS=23.5k, BW=91.8MiB/s (96.3MB/s)(918MiB/10001msec) 00:33:22.881 slat (nsec): min=13260, max=88968, avg=18734.20, stdev=3399.94 00:33:22.881 clat (usec): min=6, max=459, avg=136.08, stdev=54.00 00:33:22.881 lat (usec): min=24, max=496, avg=154.81, stdev=55.32 00:33:22.881 clat percentiles (usec): 00:33:22.881 | 50.000th=[ 133], 99.000th=[ 260], 99.900th=[ 277], 99.990th=[ 322], 00:33:22.881 | 99.999th=[ 404] 00:33:22.881 write: IOPS=28.2k, BW=110MiB/s (116MB/s)(1046MiB/9487msec); 0 zone resets 00:33:22.881 slat (usec): min=13, max=468, avg=31.25, stdev= 4.13 00:33:22.881 clat (usec): min=12, max=1847, avg=181.75, stdev=83.11 00:33:22.881 lat (usec): min=41, max=1880, avg=213.00, stdev=84.56 00:33:22.881 clat percentiles (usec): 00:33:22.881 | 50.000th=[ 176], 99.000th=[ 359], 99.900th=[ 379], 99.990th=[ 635], 00:33:22.881 | 99.999th=[ 1778] 00:33:22.881 bw ( KiB/s): min=99360, max=113368, per=94.79%, avg=106975.16, stdev=2082.02, samples=38 00:33:22.881 iops : min=24840, max=28342, avg=26743.79, stdev=520.50, samples=38 00:33:22.881 lat (usec) : 10=0.01%, 20=0.01%, 50=5.29%, 100=18.07%, 250=64.03% 00:33:22.881 lat (usec) : 500=12.59%, 750=0.01%, 1000=0.01% 00:33:22.881 lat (msec) : 2=0.01% 00:33:22.881 cpu : usr=99.64%, sys=0.00%, ctx=30, majf=0, minf=467 00:33:22.881 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:22.881 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:22.881 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:22.881 issued rwts: total=235102,267659,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:22.881 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:22.881 00:33:22.881 Run status group 0 (all jobs): 00:33:22.881 READ: bw=91.8MiB/s (96.3MB/s), 91.8MiB/s-91.8MiB/s (96.3MB/s-96.3MB/s), io=918MiB (963MB), run=10001-10001msec 00:33:22.881 WRITE: bw=110MiB/s (116MB/s), 110MiB/s-110MiB/s (116MB/s-116MB/s), io=1046MiB (1096MB), run=9487-9487msec 00:33:22.881 00:33:22.881 real 0m11.138s 00:33:22.881 user 0m31.574s 00:33:22.881 sys 0m0.394s 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:22.881 
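The verify pass that just completed is plain fio driven through SPDK's fio bdev plugin: the harness generates a small bdev.fio with one [job_*] section per bdev, preloads build/fio/spdk_bdev so the spdk_bdev ioengine resolves, and points --spdk_json_conf at the bdev layout. A trimmed, hand-runnable sketch of the same invocation follows; only serialize_overlap=1, the two job sections and the command-line flags are taken from the log above, while the remaining [global] options (thread=1, direct=1, the randwrite-plus-verify settings) are an approximation of what fio_config_gen writes for a verify workload.

# Minimal stand-in for the generated bdev.fio; the real file written by
# fio_config_gen carries more global options than shown here.
cat > /tmp/bdev.fio <<'EOF'
[global]
thread=1
direct=1
rw=randwrite
verify=crc32c
serialize_overlap=1

[job_crypto_ram]
filename=crypto_ram

[job_crypto_ram3]
filename=crypto_ram3
EOF

# The spdk_bdev ioengine lives in the fio plugin built alongside SPDK and is
# loaded via LD_PRELOAD; --spdk_json_conf supplies the bdev stack to run against.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
LD_PRELOAD=$SPDK_DIR/build/fio/spdk_bdev /usr/src/fio/fio \
  --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
  --verify_state_save=0 --spdk_json_conf=$SPDK_DIR/test/bdev/bdev.json /tmp/bdev.fio

Run against the same bdev.json, this reproduces the randwrite-plus-verify pattern summarised in the job output above.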
************************************ 00:33:22.881 END TEST bdev_fio_rw_verify 00:33:22.881 ************************************ 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0411a913-7f78-53e6-850b-ebc1977ed2c1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "0411a913-7f78-53e6-850b-ebc1977ed2c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "48beea06-426c-5b16-8d9b-4d8fbfe63564"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "48beea06-426c-5b16-8d9b-4d8fbfe63564",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:33:22.881 crypto_ram3 ]] 00:33:22.881 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0411a913-7f78-53e6-850b-ebc1977ed2c1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "0411a913-7f78-53e6-850b-ebc1977ed2c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "48beea06-426c-5b16-8d9b-4d8fbfe63564"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "48beea06-426c-5b16-8d9b-4d8fbfe63564",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:22.882 ************************************ 00:33:22.882 START TEST bdev_fio_trim 00:33:22.882 ************************************ 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:22.882 07:38:53 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:22.882 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:22.882 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:22.882 fio-3.35 00:33:22.882 Starting 2 threads 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:22.882 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:22.882 EAL: Requested device 
0000:3d:01.7 cannot be used 00:33:22.882 [the same qat_pci_device_allocate() / EAL "Requested device ... cannot be used" notices repeat here for the remaining 0000:3d and 0000:3f QAT functions] 00:33:32.862 00:33:32.862 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1820546: Thu Jul 25 07:39:04 2024 00:33:32.862 write: IOPS=29.9k,
BW=117MiB/s (122MB/s)(1167MiB/10001msec); 0 zone resets 00:33:32.862 slat (usec): min=11, max=1776, avg=29.16, stdev= 7.45 00:33:32.862 clat (usec): min=34, max=2060, avg=218.81, stdev=98.68 00:33:32.862 lat (usec): min=46, max=2090, avg=247.97, stdev=100.64 00:33:32.862 clat percentiles (usec): 00:33:32.862 | 50.000th=[ 219], 99.000th=[ 445], 99.900th=[ 494], 99.990th=[ 758], 00:33:32.862 | 99.999th=[ 971] 00:33:32.862 bw ( KiB/s): min=100328, max=131784, per=100.00%, avg=119536.00, stdev=5234.04, samples=38 00:33:32.862 iops : min=25082, max=32946, avg=29884.00, stdev=1308.51, samples=38 00:33:32.862 trim: IOPS=29.9k, BW=117MiB/s (122MB/s)(1167MiB/10001msec); 0 zone resets 00:33:32.862 slat (usec): min=4, max=229, avg=14.54, stdev= 4.93 00:33:32.862 clat (usec): min=27, max=2090, avg=144.82, stdev=58.49 00:33:32.862 lat (usec): min=32, max=2105, avg=159.36, stdev=59.44 00:33:32.862 clat percentiles (usec): 00:33:32.862 | 50.000th=[ 137], 99.000th=[ 285], 99.900th=[ 326], 99.990th=[ 445], 00:33:32.862 | 99.999th=[ 668] 00:33:32.862 bw ( KiB/s): min=100352, max=131792, per=100.00%, avg=119537.26, stdev=5233.10, samples=38 00:33:32.862 iops : min=25088, max=32948, avg=29884.32, stdev=1308.27, samples=38 00:33:32.862 lat (usec) : 50=0.99%, 100=18.62%, 250=58.13%, 500=22.22%, 750=0.04% 00:33:32.862 lat (usec) : 1000=0.01% 00:33:32.862 lat (msec) : 4=0.01% 00:33:32.862 cpu : usr=99.62%, sys=0.00%, ctx=92, majf=0, minf=341 00:33:32.862 IO depths : 1=6.6%, 2=16.1%, 4=61.8%, 8=15.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:32.862 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:32.862 complete : 0=0.0%, 4=86.6%, 8=13.4%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:32.862 issued rwts: total=0,298656,298656,0 short=0,0,0,0 dropped=0,0,0,0 00:33:32.862 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:32.862 00:33:32.862 Run status group 0 (all jobs): 00:33:32.862 WRITE: bw=117MiB/s (122MB/s), 117MiB/s-117MiB/s (122MB/s-122MB/s), io=1167MiB (1223MB), run=10001-10001msec 00:33:32.862 TRIM: bw=117MiB/s (122MB/s), 117MiB/s-117MiB/s (122MB/s-122MB/s), io=1167MiB (1223MB), run=10001-10001msec 00:33:32.862 00:33:32.862 real 0m11.157s 00:33:32.862 user 0m31.970s 00:33:32.862 sys 0m0.371s 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:32.862 ************************************ 00:33:32.862 END TEST bdev_fio_trim 00:33:32.862 ************************************ 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:33:32.862 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:33:32.862 00:33:32.862 real 0m22.645s 00:33:32.862 user 1m3.735s 00:33:32.862 sys 0m0.945s 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:32.862 07:39:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:32.862 ************************************ 00:33:32.862 END TEST bdev_fio 00:33:32.862 ************************************ 00:33:32.862 07:39:04 
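The trim pass above did not hard-code its targets: blockdev.sh dumps each bdev descriptor (the long printf '%s\n' '{ ... }' blocks) and keeps only the names whose supported_io_types.unmap flag is true before writing the [job_*] stanzas. The same selection can be reproduced against a running SPDK target; the rpc.py path and the default RPC socket are assumptions based on the workspace layout in this log.

# Emit a trim-capable job list the same way the test does: keep only bdevs
# that advertise unmap support. Assumes an SPDK app is up and answering on
# the default RPC socket.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

$SPDK_DIR/scripts/rpc.py bdev_get_bdevs |
  jq -r '.[] | select(.supported_io_types.unmap == true) | .name' |
  while read -r b; do
    printf '[job_%s]\nfilename=%s\n' "$b" "$b"
  done

# The same dump also shows how each crypto vbdev is stacked and keyed:
$SPDK_DIR/scripts/rpc.py bdev_get_bdevs |
  jq -r '.[] | select(.product_name == "crypto")
         | "\(.name) -> base=\(.driver_specific.crypto.base_bdev_name), key=\(.driver_specific.crypto.key_name)"'

For the descriptors above this prints crypto_ram -> base=Malloc0, key=test_dek_sw and crypto_ram3 -> base=crypto_ram2, key=test_dek_sw3, matching the driver_specific fields in the dump.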
blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:32.862 07:39:04 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:32.862 07:39:04 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:33:32.862 07:39:04 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:32.862 07:39:04 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:32.862 ************************************ 00:33:32.862 START TEST bdev_verify 00:33:32.862 ************************************ 00:33:32.862 07:39:05 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:32.862 [2024-07-25 07:39:05.053126] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:33:32.862 [2024-07-25 07:39:05.053186] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1822756 ] 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested 
device 0000:3d:02.6 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.862 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:32.862 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:32.863 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:32.863 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:32.863 [2024-07-25 07:39:05.185254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:32.863 [2024-07-25 07:39:05.268992] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:33:32.863 [2024-07-25 07:39:05.268998] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:33.121 [2024-07-25 07:39:05.427266] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:33.121 [2024-07-25 07:39:05.427329] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:33.121 [2024-07-25 07:39:05.427343] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:33.121 [2024-07-25 07:39:05.435288] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:33.121 [2024-07-25 07:39:05.435305] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:33.121 [2024-07-25 07:39:05.435316] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:33.121 
[2024-07-25 07:39:05.443311] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:33.121 [2024-07-25 07:39:05.443327] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:33.121 [2024-07-25 07:39:05.443338] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:33.121 Running I/O for 5 seconds... 00:33:38.393 00:33:38.393 Latency(us) 00:33:38.393 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:38.393 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:38.393 Verification LBA range: start 0x0 length 0x800 00:33:38.393 crypto_ram : 5.02 6124.46 23.92 0.00 0.00 20811.96 1454.90 26633.83 00:33:38.393 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:38.393 Verification LBA range: start 0x800 length 0x800 00:33:38.393 crypto_ram : 5.01 6126.48 23.93 0.00 0.00 20807.12 1671.17 26528.97 00:33:38.393 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:38.393 Verification LBA range: start 0x0 length 0x800 00:33:38.393 crypto_ram3 : 5.04 3076.02 12.02 0.00 0.00 41365.14 1743.26 30408.70 00:33:38.393 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:38.393 Verification LBA range: start 0x800 length 0x800 00:33:38.393 crypto_ram3 : 5.03 3076.99 12.02 0.00 0.00 41346.33 2070.94 30408.70 00:33:38.393 =================================================================================================================== 00:33:38.393 Total : 18403.95 71.89 0.00 0.00 27696.21 1454.90 30408.70 00:33:38.393 00:33:38.393 real 0m5.734s 00:33:38.393 user 0m10.833s 00:33:38.393 sys 0m0.228s 00:33:38.393 07:39:10 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:38.393 07:39:10 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:38.393 ************************************ 00:33:38.393 END TEST bdev_verify 00:33:38.393 ************************************ 00:33:38.393 07:39:10 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:38.393 07:39:10 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:33:38.393 07:39:10 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:38.393 07:39:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:38.393 ************************************ 00:33:38.393 START TEST bdev_verify_big_io 00:33:38.393 ************************************ 00:33:38.393 07:39:10 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:38.393 [2024-07-25 07:39:10.862032] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
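The "Found key test_dek_sw*" and "vbdev creation deferred pending base bdev arrival" notices here (and in the earlier fio runs) come from the crypto stack that bdev.json asks each tool to build: AES_XTS keys registered with the accel layer, malloc bdevs as backing stores, and crypto vbdevs layered on top (crypto_ram over Malloc0; crypto_ram3 over crypto_ram2, which the dump implies is itself a crypto bdev over Malloc1). The actual bdev.json is not reproduced in this log, so the sketch below is only a plausible shape for one layer of it: the method names and the base_bdev_name / name / key_name parameters mirror the fields visible in the bdev dump, while the key material and the exact parameter set accepted by this SPDK revision are assumptions.

# Hypothetical single-layer bdev.json in the spirit of the config the runs
# above load; key values are placeholders, not the test's real keys.
cat > /tmp/bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        { "method": "accel_crypto_key_create",
          "params": { "name": "test_dek_sw", "cipher": "AES_XTS",
                      "key":  "00112233445566778899aabbccddeeff",
                      "key2": "ffeeddccbbaa99887766554433221100" } }
      ]
    },
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 32768, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram",
                      "key_name": "test_dek_sw" } }
      ]
    }
  ]
}
EOF

When bdev_crypto_create runs before its base bdev is registered, vbdev_crypto cannot open it yet and simply defers creation until the base arrives, which is exactly the pair of NOTICE lines repeated for each layer above.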
00:33:38.393 [2024-07-25 07:39:10.862084] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1823822 ] 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:38.653 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:38.653 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:38.653 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:38.653 [2024-07-25 07:39:10.991751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:38.653 [2024-07-25 07:39:11.075695] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:33:38.653 [2024-07-25 07:39:11.075700] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:38.912 [2024-07-25 07:39:11.233888] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:38.912 [2024-07-25 07:39:11.233941] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:38.912 [2024-07-25 07:39:11.233961] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.912 [2024-07-25 07:39:11.241912] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:38.912 [2024-07-25 07:39:11.241929] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:38.912 [2024-07-25 07:39:11.241940] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.912 [2024-07-25 07:39:11.249938] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:38.913 [2024-07-25 07:39:11.249954] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:38.913 [2024-07-25 07:39:11.249964] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.913 Running I/O for 5 seconds... 
00:33:44.183 00:33:44.183 Latency(us) 00:33:44.183 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:44.183 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:44.183 Verification LBA range: start 0x0 length 0x80 00:33:44.183 crypto_ram : 5.07 429.25 26.83 0.00 0.00 291072.08 6160.38 382520.52 00:33:44.183 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:44.183 Verification LBA range: start 0x80 length 0x80 00:33:44.183 crypto_ram : 5.07 429.03 26.81 0.00 0.00 291171.02 5793.38 382520.52 00:33:44.183 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:44.183 Verification LBA range: start 0x0 length 0x80 00:33:44.184 crypto_ram3 : 5.27 242.76 15.17 0.00 0.00 496037.93 5583.67 399297.74 00:33:44.184 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:44.184 Verification LBA range: start 0x80 length 0x80 00:33:44.184 crypto_ram3 : 5.27 242.66 15.17 0.00 0.00 496005.41 5321.52 399297.74 00:33:44.184 =================================================================================================================== 00:33:44.184 Total : 1343.71 83.98 0.00 0.00 367010.48 5321.52 399297.74 00:33:44.442 00:33:44.442 real 0m5.980s 00:33:44.442 user 0m11.333s 00:33:44.442 sys 0m0.222s 00:33:44.442 07:39:16 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:44.442 07:39:16 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:33:44.442 ************************************ 00:33:44.442 END TEST bdev_verify_big_io 00:33:44.442 ************************************ 00:33:44.442 07:39:16 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:44.442 07:39:16 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:33:44.442 07:39:16 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:44.442 07:39:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:44.442 ************************************ 00:33:44.442 START TEST bdev_write_zeroes 00:33:44.442 ************************************ 00:33:44.442 07:39:16 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:44.442 [2024-07-25 07:39:16.906756] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
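The three bdevperf passes in this stretch differ only in I/O size, workload and duration: the 4 KiB verify run, the 64 KiB "big IO" verify run whose table appears above, and the short write_zeroes run that follows. The commands below restate those invocations in a directly runnable form; -q is the queue depth, -o the I/O size in bytes, -w the workload, -t the run time in seconds and -m the reactor core mask, while -C is simply carried over verbatim from the log.

# Re-running the bdevperf workloads from this section by hand against the
# same JSON bdev config.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
BDEVPERF=$SPDK_DIR/build/examples/bdevperf

# 4 KiB verify on two cores (TEST bdev_verify)
$BDEVPERF --json $SPDK_DIR/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3
# 64 KiB verify on two cores (TEST bdev_verify_big_io)
$BDEVPERF --json $SPDK_DIR/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3
# 1-second write_zeroes pass on the default single core (TEST bdev_write_zeroes)
$BDEVPERF --json $SPDK_DIR/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1

In both verify tables the crypto_ram3 stack sustains roughly half the IOPS of crypto_ram at about twice the average latency, consistent with every I/O passing through two software crypto layers instead of one.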
00:33:44.442 [2024-07-25 07:39:16.906795] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1824744 ] 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:44.442 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:44.442 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:44.442 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:44.700 [2024-07-25 07:39:17.025190] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:44.700 [2024-07-25 07:39:17.108514] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:44.959 [2024-07-25 07:39:17.279119] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:44.959 [2024-07-25 07:39:17.279187] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:44.959 [2024-07-25 07:39:17.279201] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.959 [2024-07-25 07:39:17.287145] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:44.959 [2024-07-25 07:39:17.287163] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:44.959 [2024-07-25 07:39:17.287174] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.959 [2024-07-25 07:39:17.295166] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:44.959 [2024-07-25 07:39:17.295182] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:44.959 [2024-07-25 07:39:17.295193] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.959 Running I/O for 1 seconds... 
00:33:45.902
00:33:45.902 Latency(us)
00:33:45.902 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:45.902 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:45.902 crypto_ram : 1.01 28653.44 111.93 0.00 0.00 4458.61 1199.31 6212.81
00:33:45.902 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:45.902 crypto_ram3 : 1.01 14299.95 55.86 0.00 0.00 8899.84 5531.24 9227.47
00:33:45.902 ===================================================================================================================
00:33:45.902 Total : 42953.39 167.79 0.00 0.00 5939.02 1199.31 9227.47
00:33:46.161 00:33:46.161 real 0m1.676s 00:33:46.161 user 0m1.457s 00:33:46.161 sys 0m0.198s 00:33:46.161 07:39:18 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:46.161 07:39:18 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:46.161 ************************************ 00:33:46.161 END TEST bdev_write_zeroes 00:33:46.161 ************************************ 00:33:46.161 07:39:18 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:46.161 07:39:18 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:33:46.161 07:39:18 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:46.161 07:39:18 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:46.161 ************************************ 00:33:46.161 START TEST bdev_json_nonenclosed 00:33:46.161 ************************************ 00:33:46.161 07:39:18 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:46.161 [2024-07-25 07:39:18.664252] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization...
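The bdev_json_nonenclosed run started above is a negative test: bdevperf is pointed at a --json file whose top level is not a JSON object, and the pass criterion is the clean failure reported a few lines below ("Invalid JSON configuration: not enclosed in {}", followed by spdk_app_stop on a non-zero code). The fragment below only illustrates the shape json_config_prepare_ctx accepts versus the kind of input this test feeds it; it is not the actual contents of test/bdev/nonenclosed.json.

# Illustrative shapes only -- not the repository's nonenclosed.json.
cat > /tmp/enclosed.json <<'EOF'
{
  "subsystems": [
    { "subsystem": "bdev", "config": [] }
  ]
}
EOF
cat > /tmp/nonenclosed.json <<'EOF'
"subsystems": [ { "subsystem": "bdev", "config": [] } ]
EOF
# bdevperf --json /tmp/enclosed.json ...    -> starts and runs the workload
# bdevperf --json /tmp/nonenclosed.json ... -> "not enclosed in {}", app stops with an error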
00:33:46.161 [2024-07-25 07:39:18.664306] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1825024 ] 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:46.420 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:46.420 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.420 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:46.420 [2024-07-25 07:39:18.794887] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:46.420 [2024-07-25 07:39:18.878399] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:46.420 [2024-07-25 07:39:18.878465] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:46.420 [2024-07-25 07:39:18.878481] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:46.420 [2024-07-25 07:39:18.878492] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:46.679 00:33:46.679 real 0m0.357s 00:33:46.679 user 0m0.216s 00:33:46.679 sys 0m0.138s 00:33:46.679 07:39:18 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:46.679 07:39:18 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:46.679 ************************************ 00:33:46.679 END TEST bdev_json_nonenclosed 00:33:46.679 ************************************ 00:33:46.679 07:39:19 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:46.679 07:39:19 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:33:46.679 07:39:19 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:46.679 07:39:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:46.679 ************************************ 00:33:46.679 START TEST bdev_json_nonarray 00:33:46.679 ************************************ 00:33:46.679 07:39:19 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:46.679 [2024-07-25 07:39:19.087770] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
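bdev_json_nonarray, launched just above, is the companion check: here the file is enclosed in an object, but its "subsystems" member is not an array, so json_config_prepare_ctx is expected to reject it with the "'subsystems' should be an array" error that appears further down, again followed by spdk_app_stop on a non-zero code. A minimal illustration of that shape (not the repository's nonarray.json):

cat > /tmp/nonarray.json <<'EOF'
{ "subsystems": { "subsystem": "bdev", "config": [] } }
EOF
# bdevperf --json /tmp/nonarray.json ... -> "'subsystems' should be an array", app stops with an error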
00:33:46.679 [2024-07-25 07:39:19.087822] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1825198 ] 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:46.679 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:46.679 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:46.679 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:46.940 [2024-07-25 07:39:19.218872] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:46.940 [2024-07-25 07:39:19.301984] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:46.940 [2024-07-25 07:39:19.302055] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:33:46.940 [2024-07-25 07:39:19.302071] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:46.940 [2024-07-25 07:39:19.302082] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:46.940 00:33:46.940 real 0m0.358s 00:33:46.940 user 0m0.203s 00:33:46.940 sys 0m0.153s 00:33:46.940 07:39:19 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:46.940 07:39:19 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:46.940 ************************************ 00:33:46.940 END TEST bdev_json_nonarray 00:33:46.940 ************************************ 00:33:46.940 07:39:19 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:33:46.940 07:39:19 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:33:46.940 07:39:19 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]] 00:33:46.940 07:39:19 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:33:46.940 07:39:19 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:33:46.940 07:39:19 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:46.940 07:39:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:46.941 ************************************ 00:33:46.941 START TEST bdev_crypto_enomem 00:33:46.941 ************************************ 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # bdev_crypto_enomem 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 
-- # local err_dev=EE_base0 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=1825318 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 1825318 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # '[' -z 1825318 ']' 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:46.941 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:46.941 07:39:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:47.200 [2024-07-25 07:39:19.509637] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:33:47.200 [2024-07-25 07:39:19.509692] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1825318 ] 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.2 cannot be used 
00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:47.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:47.200 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:47.200 [2024-07-25 07:39:19.629478] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:47.200 [2024-07-25 07:39:19.714560] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@864 -- # return 0 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:48.135 true 00:33:48.135 base0 00:33:48.135 true 00:33:48.135 [2024-07-25 07:39:20.435345] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:48.135 crypt0 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_name=crypt0 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # local i 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:48.135 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:48.135 [ 00:33:48.135 { 00:33:48.135 "name": "crypt0", 00:33:48.135 "aliases": [ 00:33:48.135 "45b7500f-0a7b-5273-a0bb-68000875b21b" 00:33:48.135 ], 00:33:48.135 "product_name": "crypto", 00:33:48.135 "block_size": 512, 00:33:48.135 "num_blocks": 2097152, 00:33:48.135 "uuid": "45b7500f-0a7b-5273-a0bb-68000875b21b", 00:33:48.135 "assigned_rate_limits": { 00:33:48.135 "rw_ios_per_sec": 0, 00:33:48.135 "rw_mbytes_per_sec": 0, 00:33:48.135 "r_mbytes_per_sec": 0, 00:33:48.135 "w_mbytes_per_sec": 0 00:33:48.135 }, 00:33:48.135 "claimed": false, 00:33:48.135 "zoned": false, 00:33:48.135 "supported_io_types": { 00:33:48.135 "read": true, 00:33:48.135 "write": true, 00:33:48.135 "unmap": false, 00:33:48.135 "flush": false, 00:33:48.136 "reset": true, 00:33:48.136 "nvme_admin": false, 00:33:48.136 "nvme_io": false, 00:33:48.136 "nvme_io_md": false, 00:33:48.136 "write_zeroes": true, 00:33:48.136 "zcopy": false, 00:33:48.136 "get_zone_info": false, 00:33:48.136 "zone_management": false, 00:33:48.136 "zone_append": false, 00:33:48.136 "compare": false, 00:33:48.136 "compare_and_write": false, 00:33:48.136 "abort": false, 00:33:48.136 "seek_hole": false, 00:33:48.136 "seek_data": false, 00:33:48.136 "copy": false, 00:33:48.136 "nvme_iov_md": false 00:33:48.136 }, 00:33:48.136 "memory_domains": [ 00:33:48.136 { 00:33:48.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:48.136 "dma_device_type": 2 00:33:48.136 } 00:33:48.136 ], 00:33:48.136 "driver_specific": { 00:33:48.136 "crypto": { 00:33:48.136 "base_bdev_name": "EE_base0", 00:33:48.136 "name": "crypt0", 00:33:48.136 
"key_name": "test_dek_sw" 00:33:48.136 } 00:33:48.136 } 00:33:48.136 } 00:33:48.136 ] 00:33:48.136 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:48.136 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@907 -- # return 0 00:33:48.136 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=1825399 00:33:48.136 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:33:48.136 07:39:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:48.136 Running I/O for 5 seconds... 00:33:49.068 07:39:21 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:33:49.068 07:39:21 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:49.068 07:39:21 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:49.068 07:39:21 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:49.068 07:39:21 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 1825399 00:33:53.249 00:33:53.249 Latency(us) 00:33:53.249 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:53.249 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:33:53.249 crypt0 : 5.00 39097.19 152.72 0.00 0.00 814.96 380.11 1074.79 00:33:53.249 =================================================================================================================== 00:33:53.250 Total : 39097.19 152.72 0.00 0.00 814.96 380.11 1074.79 00:33:53.250 0 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 1825318 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # '[' -z 1825318 ']' 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # kill -0 1825318 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # uname 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1825318 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1825318' 00:33:53.250 killing process with pid 1825318 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@969 -- # kill 1825318 00:33:53.250 Received shutdown signal, test time was 
about 5.000000 seconds 00:33:53.250 00:33:53.250 Latency(us) 00:33:53.250 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:53.250 =================================================================================================================== 00:33:53.250 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:53.250 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@974 -- # wait 1825318 00:33:53.508 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:33:53.508 00:33:53.508 real 0m6.406s 00:33:53.508 user 0m6.658s 00:33:53.508 sys 0m0.359s 00:33:53.508 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:53.508 07:39:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:53.508 ************************************ 00:33:53.508 END TEST bdev_crypto_enomem 00:33:53.508 ************************************ 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # cleanup 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:33:53.508 07:39:25 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:33:53.508 00:33:53.508 real 0m53.414s 00:33:53.508 user 1m48.495s 00:33:53.508 sys 0m6.211s 00:33:53.508 07:39:25 blockdev_crypto_sw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:53.508 07:39:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:53.508 ************************************ 00:33:53.508 END TEST blockdev_crypto_sw 00:33:53.508 ************************************ 00:33:53.508 07:39:25 -- spdk/autotest.sh@363 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:53.508 07:39:25 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:33:53.508 07:39:25 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:53.508 07:39:25 -- common/autotest_common.sh@10 -- # set +x 00:33:53.508 ************************************ 00:33:53.508 START TEST blockdev_crypto_qat 00:33:53.508 ************************************ 00:33:53.508 07:39:25 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:53.767 * Looking for test storage... 
00:33:53.767 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # env_ctx= 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1826445 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1826445 00:33:53.767 07:39:26 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:53.767 07:39:26 blockdev_crypto_qat -- common/autotest_common.sh@831 -- # '[' -z 1826445 ']' 00:33:53.767 07:39:26 blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:53.767 07:39:26 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:53.767 07:39:26 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:33:53.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:53.767 07:39:26 blockdev_crypto_qat -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:53.767 07:39:26 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:53.767 [2024-07-25 07:39:26.176069] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:33:53.767 [2024-07-25 07:39:26.176152] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1826445 ] 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:53.767 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.767 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:53.768 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:53.768 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:53.768 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:54.026 [2024-07-25 07:39:26.308870] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:54.026 [2024-07-25 07:39:26.395611] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:54.592 07:39:27 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:54.592 07:39:27 blockdev_crypto_qat -- common/autotest_common.sh@864 -- # return 0 00:33:54.593 07:39:27 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:33:54.593 07:39:27 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:33:54.593 07:39:27 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:33:54.593 07:39:27 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:54.593 07:39:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:54.593 [2024-07-25 07:39:27.077734] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:54.593 [2024-07-25 07:39:27.085769] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:54.593 [2024-07-25 07:39:27.093787] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:54.851 [2024-07-25 07:39:27.160793] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:57.384 true 00:33:57.384 true 00:33:57.384 true 00:33:57.384 true 00:33:57.384 Malloc0 00:33:57.384 Malloc1 00:33:57.384 Malloc2 00:33:57.384 Malloc3 00:33:57.384 [2024-07-25 07:39:29.484710] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:57.384 crypto_ram 00:33:57.384 [2024-07-25 07:39:29.492732] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:57.384 crypto_ram1 
00:33:57.384 [2024-07-25 07:39:29.500755] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:57.384 crypto_ram2 00:33:57.384 [2024-07-25 07:39:29.508774] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:57.384 crypto_ram3 00:33:57.384 [ 00:33:57.384 { 00:33:57.384 "name": "Malloc1", 00:33:57.384 "aliases": [ 00:33:57.384 "9a22a91a-3398-45d7-8ef6-777fdfc8b4c0" 00:33:57.384 ], 00:33:57.384 "product_name": "Malloc disk", 00:33:57.384 "block_size": 512, 00:33:57.384 "num_blocks": 65536, 00:33:57.384 "uuid": "9a22a91a-3398-45d7-8ef6-777fdfc8b4c0", 00:33:57.384 "assigned_rate_limits": { 00:33:57.384 "rw_ios_per_sec": 0, 00:33:57.384 "rw_mbytes_per_sec": 0, 00:33:57.384 "r_mbytes_per_sec": 0, 00:33:57.384 "w_mbytes_per_sec": 0 00:33:57.384 }, 00:33:57.384 "claimed": true, 00:33:57.384 "claim_type": "exclusive_write", 00:33:57.384 "zoned": false, 00:33:57.384 "supported_io_types": { 00:33:57.384 "read": true, 00:33:57.384 "write": true, 00:33:57.384 "unmap": true, 00:33:57.384 "flush": true, 00:33:57.384 "reset": true, 00:33:57.384 "nvme_admin": false, 00:33:57.384 "nvme_io": false, 00:33:57.384 "nvme_io_md": false, 00:33:57.384 "write_zeroes": true, 00:33:57.384 "zcopy": true, 00:33:57.384 "get_zone_info": false, 00:33:57.384 "zone_management": false, 00:33:57.384 "zone_append": false, 00:33:57.384 "compare": false, 00:33:57.384 "compare_and_write": false, 00:33:57.384 "abort": true, 00:33:57.384 "seek_hole": false, 00:33:57.384 "seek_data": false, 00:33:57.384 "copy": true, 00:33:57.384 "nvme_iov_md": false 00:33:57.384 }, 00:33:57.384 "memory_domains": [ 00:33:57.384 { 00:33:57.384 "dma_device_id": "system", 00:33:57.384 "dma_device_type": 1 00:33:57.384 }, 00:33:57.384 { 00:33:57.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:57.384 "dma_device_type": 2 00:33:57.384 } 00:33:57.384 ], 00:33:57.384 "driver_specific": {} 00:33:57.384 } 00:33:57.384 ] 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:57.384 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:57.384 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:33:57.384 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:57.384 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:57.384 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:57.384 07:39:29 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:57.384 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:33:57.384 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:57.384 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa4872c4-3416-585e-86af-1fe97b77e7e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "aa4872c4-3416-585e-86af-1fe97b77e7e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "31487710-cae9-5f1b-b5ab-129939a6e5e9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "31487710-cae9-5f1b-b5ab-129939a6e5e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5c7ec68f-5545-5549-9209-28b60bcf451e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5c7ec68f-5545-5549-9209-28b60bcf451e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a92853ee-eed4-5ea0-803f-b82410a83346"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a92853ee-eed4-5ea0-803f-b82410a83346",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:33:57.385 07:39:29 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 1826445 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # '[' -z 1826445 ']' 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # kill -0 1826445 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # uname 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1826445 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1826445' 00:33:57.385 killing process with pid 1826445 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@969 -- # kill 1826445 00:33:57.385 07:39:29 blockdev_crypto_qat -- common/autotest_common.sh@974 -- # wait 1826445 00:33:57.952 07:39:30 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:57.952 07:39:30 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test 
bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:57.952 07:39:30 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:33:57.952 07:39:30 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:57.952 07:39:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:57.952 ************************************ 00:33:57.952 START TEST bdev_hello_world 00:33:57.952 ************************************ 00:33:57.952 07:39:30 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:57.952 [2024-07-25 07:39:30.366614] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:33:57.952 [2024-07-25 07:39:30.366672] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1827118 ] 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3d:02.7 cannot be used 
00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:57.952 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:57.952 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:58.212 [2024-07-25 07:39:30.496363] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:58.212 [2024-07-25 07:39:30.579403] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:33:58.212 [2024-07-25 07:39:30.600651] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:58.212 [2024-07-25 07:39:30.608672] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:58.212 [2024-07-25 07:39:30.616690] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:58.212 [2024-07-25 07:39:30.721299] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:00.746 [2024-07-25 07:39:32.885744] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:00.746 [2024-07-25 07:39:32.885808] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:00.746 [2024-07-25 07:39:32.885822] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:00.746 [2024-07-25 07:39:32.893764] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_xts" 00:34:00.746 [2024-07-25 07:39:32.893781] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:00.746 [2024-07-25 07:39:32.893792] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:00.746 [2024-07-25 07:39:32.901783] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:00.746 [2024-07-25 07:39:32.901799] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:00.746 [2024-07-25 07:39:32.901809] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:00.746 [2024-07-25 07:39:32.909804] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:00.746 [2024-07-25 07:39:32.909820] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:00.746 [2024-07-25 07:39:32.909831] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:00.746 [2024-07-25 07:39:32.980960] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:00.746 [2024-07-25 07:39:32.980999] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:00.746 [2024-07-25 07:39:32.981016] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:00.746 [2024-07-25 07:39:32.982187] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:00.746 [2024-07-25 07:39:32.982254] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:00.746 [2024-07-25 07:39:32.982269] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:00.746 [2024-07-25 07:39:32.982308] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:00.746 00:34:00.746 [2024-07-25 07:39:32.982325] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:01.004 00:34:01.004 real 0m2.982s 00:34:01.004 user 0m2.640s 00:34:01.004 sys 0m0.308s 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:01.004 ************************************ 00:34:01.004 END TEST bdev_hello_world 00:34:01.004 ************************************ 00:34:01.004 07:39:33 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:34:01.004 07:39:33 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:34:01.004 07:39:33 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:01.004 07:39:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:01.004 ************************************ 00:34:01.004 START TEST bdev_bounds 00:34:01.004 ************************************ 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1827587 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1827587' 00:34:01.004 Process bdevio pid: 1827587 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1827587 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1827587 ']' 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:01.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:01.004 07:39:33 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:01.004 [2024-07-25 07:39:33.428937] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:34:01.004 [2024-07-25 07:39:33.428994] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1827587 ] 00:34:01.004 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.6 cannot be used 
00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:01.005 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:01.005 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:01.263 [2024-07-25 07:39:33.562242] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:01.263 [2024-07-25 07:39:33.651497] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:34:01.263 [2024-07-25 07:39:33.651592] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:34:01.263 [2024-07-25 07:39:33.651596] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:34:01.263 [2024-07-25 07:39:33.672886] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:01.263 [2024-07-25 07:39:33.680916] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:01.263 [2024-07-25 07:39:33.688937] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:01.263 [2024-07-25 07:39:33.793893] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:03.831 [2024-07-25 07:39:35.947028] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:03.831 [2024-07-25 07:39:35.947102] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:03.831 [2024-07-25 07:39:35.947115] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:03.831 [2024-07-25 07:39:35.955047] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:03.831 [2024-07-25 07:39:35.955065] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:03.831 [2024-07-25 07:39:35.955075] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:03.831 [2024-07-25 07:39:35.963066] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:03.831 [2024-07-25 07:39:35.963082] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:03.831 [2024-07-25 07:39:35.963093] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:03.831 [2024-07-25 07:39:35.971087] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts2" 00:34:03.831 [2024-07-25 07:39:35.971103] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:03.831 [2024-07-25 07:39:35.971113] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:03.831 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:03.831 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:34:03.831 07:39:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:03.831 I/O targets: 00:34:03.831 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:03.831 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:34:03.831 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:34:03.831 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:03.831 00:34:03.831 00:34:03.831 CUnit - A unit testing framework for C - Version 2.1-3 00:34:03.831 http://cunit.sourceforge.net/ 00:34:03.831 00:34:03.831 00:34:03.831 Suite: bdevio tests on: crypto_ram3 00:34:03.831 Test: blockdev write read block ...passed 00:34:03.831 Test: blockdev write zeroes read block ...passed 00:34:03.831 Test: blockdev write zeroes read no split ...passed 00:34:03.831 Test: blockdev write zeroes read split ...passed 00:34:03.831 Test: blockdev write zeroes read split partial ...passed 00:34:03.831 Test: blockdev reset ...passed 00:34:03.831 Test: blockdev write read 8 blocks ...passed 00:34:03.831 Test: blockdev write read size > 128k ...passed 00:34:03.831 Test: blockdev write read invalid size ...passed 00:34:03.831 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:03.831 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:03.832 Test: blockdev write read max offset ...passed 00:34:03.832 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:03.832 Test: blockdev writev readv 8 blocks ...passed 00:34:03.832 Test: blockdev writev readv 30 x 1block ...passed 00:34:03.832 Test: blockdev writev readv block ...passed 00:34:03.832 Test: blockdev writev readv size > 128k ...passed 00:34:03.832 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:03.832 Test: blockdev comparev and writev ...passed 00:34:03.832 Test: blockdev nvme passthru rw ...passed 00:34:03.832 Test: blockdev nvme passthru vendor specific ...passed 00:34:03.832 Test: blockdev nvme admin passthru ...passed 00:34:03.832 Test: blockdev copy ...passed 00:34:03.832 Suite: bdevio tests on: crypto_ram2 00:34:03.832 Test: blockdev write read block ...passed 00:34:03.832 Test: blockdev write zeroes read block ...passed 00:34:03.832 Test: blockdev write zeroes read no split ...passed 00:34:03.832 Test: blockdev write zeroes read split ...passed 00:34:03.832 Test: blockdev write zeroes read split partial ...passed 00:34:03.832 Test: blockdev reset ...passed 00:34:03.832 Test: blockdev write read 8 blocks ...passed 00:34:03.832 Test: blockdev write read size > 128k ...passed 00:34:03.832 Test: blockdev write read invalid size ...passed 00:34:03.832 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:03.832 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:03.832 Test: blockdev write read max offset ...passed 00:34:03.832 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:03.832 Test: blockdev 
writev readv 8 blocks ...passed 00:34:03.832 Test: blockdev writev readv 30 x 1block ...passed 00:34:03.832 Test: blockdev writev readv block ...passed 00:34:03.832 Test: blockdev writev readv size > 128k ...passed 00:34:03.832 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:03.832 Test: blockdev comparev and writev ...passed 00:34:03.832 Test: blockdev nvme passthru rw ...passed 00:34:03.832 Test: blockdev nvme passthru vendor specific ...passed 00:34:03.832 Test: blockdev nvme admin passthru ...passed 00:34:03.832 Test: blockdev copy ...passed 00:34:03.832 Suite: bdevio tests on: crypto_ram1 00:34:03.832 Test: blockdev write read block ...passed 00:34:03.832 Test: blockdev write zeroes read block ...passed 00:34:03.832 Test: blockdev write zeroes read no split ...passed 00:34:03.832 Test: blockdev write zeroes read split ...passed 00:34:03.832 Test: blockdev write zeroes read split partial ...passed 00:34:03.832 Test: blockdev reset ...passed 00:34:03.832 Test: blockdev write read 8 blocks ...passed 00:34:03.832 Test: blockdev write read size > 128k ...passed 00:34:03.832 Test: blockdev write read invalid size ...passed 00:34:03.832 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:03.832 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:03.832 Test: blockdev write read max offset ...passed 00:34:03.832 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:03.832 Test: blockdev writev readv 8 blocks ...passed 00:34:03.832 Test: blockdev writev readv 30 x 1block ...passed 00:34:03.832 Test: blockdev writev readv block ...passed 00:34:03.832 Test: blockdev writev readv size > 128k ...passed 00:34:03.832 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:03.832 Test: blockdev comparev and writev ...passed 00:34:03.832 Test: blockdev nvme passthru rw ...passed 00:34:03.832 Test: blockdev nvme passthru vendor specific ...passed 00:34:03.832 Test: blockdev nvme admin passthru ...passed 00:34:03.832 Test: blockdev copy ...passed 00:34:03.832 Suite: bdevio tests on: crypto_ram 00:34:03.832 Test: blockdev write read block ...passed 00:34:03.832 Test: blockdev write zeroes read block ...passed 00:34:03.832 Test: blockdev write zeroes read no split ...passed 00:34:04.091 Test: blockdev write zeroes read split ...passed 00:34:04.091 Test: blockdev write zeroes read split partial ...passed 00:34:04.091 Test: blockdev reset ...passed 00:34:04.091 Test: blockdev write read 8 blocks ...passed 00:34:04.091 Test: blockdev write read size > 128k ...passed 00:34:04.091 Test: blockdev write read invalid size ...passed 00:34:04.091 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:04.091 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:04.091 Test: blockdev write read max offset ...passed 00:34:04.091 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:04.091 Test: blockdev writev readv 8 blocks ...passed 00:34:04.091 Test: blockdev writev readv 30 x 1block ...passed 00:34:04.091 Test: blockdev writev readv block ...passed 00:34:04.091 Test: blockdev writev readv size > 128k ...passed 00:34:04.091 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:04.091 Test: blockdev comparev and writev ...passed 00:34:04.091 Test: blockdev nvme passthru rw ...passed 00:34:04.091 Test: blockdev nvme passthru vendor specific ...passed 00:34:04.091 Test: blockdev nvme admin passthru ...passed 00:34:04.091 Test: 
blockdev copy ...passed 00:34:04.091 00:34:04.091 Run Summary: Type Total Ran Passed Failed Inactive 00:34:04.091 suites 4 4 n/a 0 0 00:34:04.091 tests 92 92 92 0 0 00:34:04.091 asserts 520 520 520 0 n/a 00:34:04.091 00:34:04.091 Elapsed time = 0.496 seconds 00:34:04.091 0 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1827587 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1827587 ']' 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1827587 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1827587 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1827587' 00:34:04.091 killing process with pid 1827587 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1827587 00:34:04.091 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1827587 00:34:04.351 07:39:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:34:04.351 00:34:04.351 real 0m3.441s 00:34:04.351 user 0m9.644s 00:34:04.351 sys 0m0.526s 00:34:04.351 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:04.351 07:39:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:04.351 ************************************ 00:34:04.351 END TEST bdev_bounds 00:34:04.351 ************************************ 00:34:04.351 07:39:36 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:04.351 07:39:36 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:34:04.351 07:39:36 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:04.351 07:39:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:04.610 ************************************ 00:34:04.610 START TEST bdev_nbd 00:34:04.610 ************************************ 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 
'crypto_ram2' 'crypto_ram3') 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1828245 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1828245 /var/tmp/spdk-nbd.sock 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1828245 ']' 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:04.610 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:04.610 07:39:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:04.610 [2024-07-25 07:39:36.966462] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:34:04.610 [2024-07-25 07:39:36.966519] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:04.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.610 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:04.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.610 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:04.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.610 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:04.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.610 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:04.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.610 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:04.610 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:04.611 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:04.611 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:04.611 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:04.611 [2024-07-25 07:39:37.098989] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:04.870 [2024-07-25 07:39:37.185531] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:34:04.870 [2024-07-25 07:39:37.206776] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:04.870 [2024-07-25 07:39:37.214797] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:04.870 [2024-07-25 07:39:37.222815] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:04.870 [2024-07-25 07:39:37.324233] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:07.406 [2024-07-25 07:39:39.482053] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:07.406 [2024-07-25 07:39:39.482105] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:07.406 [2024-07-25 07:39:39.482119] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.406 [2024-07-25 07:39:39.490072] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:07.406 [2024-07-25 07:39:39.490089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:07.406 [2024-07-25 07:39:39.490100] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.406 [2024-07-25 07:39:39.498092] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:07.406 [2024-07-25 07:39:39.498109] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:07.406 [2024-07-25 07:39:39.498119] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.406 [2024-07-25 07:39:39.506112] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:07.406 [2024-07-25 07:39:39.506127] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:07.406 [2024-07-25 07:39:39.506146] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:07.406 1+0 records in 00:34:07.406 1+0 records out 00:34:07.406 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328494 s, 12.5 MB/s 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:07.406 07:39:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:07.665 1+0 records in 00:34:07.665 1+0 records out 00:34:07.665 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313427 s, 13.1 MB/s 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:07.665 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:07.666 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:07.666 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:07.666 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:07.666 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:07.666 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:34:07.924 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:07.925 1+0 records in 00:34:07.925 1+0 records out 00:34:07.925 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324624 s, 12.6 MB/s 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:07.925 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd 
if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:08.184 1+0 records in 00:34:08.184 1+0 records out 00:34:08.184 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028782 s, 14.2 MB/s 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:08.184 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd0", 00:34:08.442 "bdev_name": "crypto_ram" 00:34:08.442 }, 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd1", 00:34:08.442 "bdev_name": "crypto_ram1" 00:34:08.442 }, 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd2", 00:34:08.442 "bdev_name": "crypto_ram2" 00:34:08.442 }, 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd3", 00:34:08.442 "bdev_name": "crypto_ram3" 00:34:08.442 } 00:34:08.442 ]' 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd0", 00:34:08.442 "bdev_name": "crypto_ram" 00:34:08.442 }, 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd1", 00:34:08.442 "bdev_name": "crypto_ram1" 00:34:08.442 }, 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd2", 00:34:08.442 "bdev_name": "crypto_ram2" 00:34:08.442 }, 00:34:08.442 { 00:34:08.442 "nbd_device": "/dev/nbd3", 00:34:08.442 "bdev_name": "crypto_ram3" 00:34:08.442 } 00:34:08.442 ]' 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:08.442 07:39:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:08.701 07:39:41 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:08.701 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:08.959 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:34:09.215 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:34:09.215 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:34:09.215 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:34:09.215 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:09.216 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:09.216 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:34:09.216 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:09.216 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:09.216 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:09.216 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:34:09.472 07:39:41 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:09.472 07:39:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:09.730 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 
00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:09.731 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:09.989 /dev/nbd0 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:09.989 1+0 records in 00:34:09.989 1+0 records out 00:34:09.989 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294648 s, 13.9 MB/s 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:09.989 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:34:10.247 /dev/nbd1 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:10.247 07:39:42 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:10.247 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:10.248 1+0 records in 00:34:10.248 1+0 records out 00:34:10.248 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306736 s, 13.4 MB/s 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:10.248 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:34:10.506 /dev/nbd10 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:10.506 1+0 records in 00:34:10.506 1+0 records out 00:34:10.506 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295972 s, 13.8 MB/s 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:10.506 07:39:42 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:10.506 07:39:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:34:10.765 /dev/nbd11 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:10.765 1+0 records in 00:34:10.765 1+0 records out 00:34:10.765 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216073 s, 19.0 MB/s 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:10.765 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd0", 00:34:11.024 "bdev_name": 
"crypto_ram" 00:34:11.024 }, 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd1", 00:34:11.024 "bdev_name": "crypto_ram1" 00:34:11.024 }, 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd10", 00:34:11.024 "bdev_name": "crypto_ram2" 00:34:11.024 }, 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd11", 00:34:11.024 "bdev_name": "crypto_ram3" 00:34:11.024 } 00:34:11.024 ]' 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd0", 00:34:11.024 "bdev_name": "crypto_ram" 00:34:11.024 }, 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd1", 00:34:11.024 "bdev_name": "crypto_ram1" 00:34:11.024 }, 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd10", 00:34:11.024 "bdev_name": "crypto_ram2" 00:34:11.024 }, 00:34:11.024 { 00:34:11.024 "nbd_device": "/dev/nbd11", 00:34:11.024 "bdev_name": "crypto_ram3" 00:34:11.024 } 00:34:11.024 ]' 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:11.024 /dev/nbd1 00:34:11.024 /dev/nbd10 00:34:11.024 /dev/nbd11' 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:11.024 /dev/nbd1 00:34:11.024 /dev/nbd10 00:34:11.024 /dev/nbd11' 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:11.024 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:11.025 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:11.025 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:11.025 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:11.025 256+0 records in 00:34:11.025 256+0 records out 00:34:11.025 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011255 s, 93.2 MB/s 00:34:11.025 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:11.025 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:11.025 256+0 records in 00:34:11.025 256+0 records out 00:34:11.025 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.058673 s, 17.9 MB/s 00:34:11.025 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:11.025 07:39:43 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:11.283 256+0 records in 00:34:11.283 256+0 records out 00:34:11.283 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0413797 s, 25.3 MB/s 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:34:11.283 256+0 records in 00:34:11.283 256+0 records out 00:34:11.283 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0341789 s, 30.7 MB/s 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:34:11.283 256+0 records in 00:34:11.283 256+0 records out 00:34:11.283 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0477053 s, 22.0 MB/s 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:11.283 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:11.541 07:39:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:11.800 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # 
break 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:12.058 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:12.316 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:12.574 07:39:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b 
malloc_lvol_verify 16 512 00:34:12.833 malloc_lvol_verify 00:34:12.833 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:13.092 7535a2ce-f306-4a03-a770-440151949f0d 00:34:13.092 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:13.350 0f393d1c-5709-4473-943f-b7cb4380a64d 00:34:13.350 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:13.351 /dev/nbd0 00:34:13.351 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:13.351 mke2fs 1.46.5 (30-Dec-2021) 00:34:13.609 Discarding device blocks: 0/4096 done 00:34:13.609 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:13.609 00:34:13.609 Allocating group tables: 0/1 done 00:34:13.609 Writing inode tables: 0/1 done 00:34:13.609 Creating journal (1024 blocks): done 00:34:13.609 Writing superblocks and filesystem accounting information: 0/1 done 00:34:13.609 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:13.609 07:39:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1828245 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1828245 ']' 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1828245 00:34:13.609 
07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:34:13.609 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:13.868 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1828245 00:34:13.868 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:13.868 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:13.868 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1828245' 00:34:13.868 killing process with pid 1828245 00:34:13.868 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1828245 00:34:13.868 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1828245 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:34:14.126 00:34:14.126 real 0m9.615s 00:34:14.126 user 0m12.574s 00:34:14.126 sys 0m3.783s 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:14.126 ************************************ 00:34:14.126 END TEST bdev_nbd 00:34:14.126 ************************************ 00:34:14.126 07:39:46 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:34:14.126 07:39:46 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:34:14.126 07:39:46 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:34:14.126 07:39:46 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:34:14.126 07:39:46 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:34:14.126 07:39:46 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:14.126 07:39:46 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:14.126 ************************************ 00:34:14.126 START TEST bdev_fio 00:34:14.126 ************************************ 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:14.126 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1281 -- # local workload=verify 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:14.126 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:14.127 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:14.127 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:14.127 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:14.385 ************************************ 00:34:14.385 START TEST bdev_fio_rw_verify 00:34:14.385 ************************************ 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:14.385 07:39:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.644 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:14.644 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:14.644 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:14.644 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:14.644 fio-3.35 00:34:14.644 Starting 4 threads 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.4 cannot be used 
00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:14.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:14.902 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:29.777 00:34:29.777 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1830641: Thu Jul 25 07:39:59 2024 00:34:29.777 read: IOPS=22.8k, BW=89.3MiB/s (93.6MB/s)(893MiB/10001msec) 00:34:29.777 slat (usec): min=11, max=1305, avg=59.78, stdev=29.57 00:34:29.777 clat (usec): min=20, max=1797, avg=342.96, stdev=205.36 00:34:29.777 lat (usec): min=56, max=1842, avg=402.74, stdev=218.90 00:34:29.777 clat percentiles (usec): 00:34:29.777 | 50.000th=[ 289], 99.000th=[ 955], 99.900th=[ 1188], 99.990th=[ 1336], 00:34:29.777 | 99.999th=[ 1369] 00:34:29.777 write: IOPS=25.1k, BW=97.9MiB/s (103MB/s)(951MiB/9717msec); 0 zone resets 00:34:29.777 slat (usec): min=17, max=416, avg=71.10, stdev=29.77 00:34:29.777 clat (usec): min=16, max=1638, avg=384.45, stdev=217.97 00:34:29.777 lat (usec): min=45, max=1851, avg=455.55, stdev=232.06 00:34:29.777 clat percentiles (usec): 
00:34:29.777 | 50.000th=[ 338], 99.000th=[ 1045], 99.900th=[ 1254], 99.990th=[ 1352], 00:34:29.777 | 99.999th=[ 1450] 00:34:29.777 bw ( KiB/s): min=82952, max=148288, per=97.46%, avg=97697.26, stdev=3896.77, samples=76 00:34:29.777 iops : min=20738, max=37072, avg=24424.32, stdev=974.19, samples=76 00:34:29.777 lat (usec) : 20=0.01%, 50=0.02%, 100=3.42%, 250=32.94%, 500=40.61% 00:34:29.777 lat (usec) : 750=16.95%, 1000=5.00% 00:34:29.777 lat (msec) : 2=1.07% 00:34:29.777 cpu : usr=99.63%, sys=0.00%, ctx=92, majf=0, minf=283 00:34:29.777 IO depths : 1=2.7%, 2=27.8%, 4=55.6%, 8=13.9%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:29.777 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:29.777 complete : 0=0.0%, 4=87.8%, 8=12.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:29.777 issued rwts: total=228516,243517,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:29.777 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:29.777 00:34:29.777 Run status group 0 (all jobs): 00:34:29.777 READ: bw=89.3MiB/s (93.6MB/s), 89.3MiB/s-89.3MiB/s (93.6MB/s-93.6MB/s), io=893MiB (936MB), run=10001-10001msec 00:34:29.777 WRITE: bw=97.9MiB/s (103MB/s), 97.9MiB/s-97.9MiB/s (103MB/s-103MB/s), io=951MiB (997MB), run=9717-9717msec 00:34:29.777 00:34:29.777 real 0m13.433s 00:34:29.777 user 0m54.586s 00:34:29.777 sys 0m0.500s 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:29.777 ************************************ 00:34:29.777 END TEST bdev_fio_rw_verify 00:34:29.777 ************************************ 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:29.777 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa4872c4-3416-585e-86af-1fe97b77e7e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "aa4872c4-3416-585e-86af-1fe97b77e7e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "31487710-cae9-5f1b-b5ab-129939a6e5e9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "31487710-cae9-5f1b-b5ab-129939a6e5e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5c7ec68f-5545-5549-9209-28b60bcf451e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5c7ec68f-5545-5549-9209-28b60bcf451e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a92853ee-eed4-5ea0-803f-b82410a83346"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a92853ee-eed4-5ea0-803f-b82410a83346",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:34:29.778 crypto_ram1 00:34:29.778 crypto_ram2 00:34:29.778 crypto_ram3 ]] 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa4872c4-3416-585e-86af-1fe97b77e7e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "aa4872c4-3416-585e-86af-1fe97b77e7e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "31487710-cae9-5f1b-b5ab-129939a6e5e9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "31487710-cae9-5f1b-b5ab-129939a6e5e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5c7ec68f-5545-5549-9209-28b60bcf451e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5c7ec68f-5545-5549-9209-28b60bcf451e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a92853ee-eed4-5ea0-803f-b82410a83346"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a92853ee-eed4-5ea0-803f-b82410a83346",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:29.778 ************************************ 00:34:29.778 START TEST bdev_fio_trim 00:34:29.778 ************************************ 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:29.778 07:40:00 
blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:29.778 07:40:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.778 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.778 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.779 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.779 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.779 fio-3.35 00:34:29.779 Starting 4 threads 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:34:29.779 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:29.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:29.779 EAL: 
Requested device 0000:3f:02.7 cannot be used 00:34:41.985 00:34:41.985 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1833036: Thu Jul 25 07:40:13 2024 00:34:41.985 write: IOPS=46.8k, BW=183MiB/s (192MB/s)(1830MiB/10001msec); 0 zone resets 00:34:41.985 slat (usec): min=15, max=189, avg=49.26, stdev=22.45 00:34:41.985 clat (usec): min=40, max=1548, avg=180.90, stdev=95.92 00:34:41.985 lat (usec): min=56, max=1597, avg=230.16, stdev=107.11 00:34:41.985 clat percentiles (usec): 00:34:41.985 | 50.000th=[ 165], 99.000th=[ 478], 99.900th=[ 537], 99.990th=[ 570], 00:34:41.985 | 99.999th=[ 660] 00:34:41.985 bw ( KiB/s): min=158880, max=245952, per=100.00%, avg=188740.26, stdev=9729.51, samples=76 00:34:41.985 iops : min=39720, max=61488, avg=47185.05, stdev=2432.37, samples=76 00:34:41.985 trim: IOPS=46.8k, BW=183MiB/s (192MB/s)(1830MiB/10001msec); 0 zone resets 00:34:41.985 slat (usec): min=5, max=172, avg=14.13, stdev= 5.27 00:34:41.985 clat (usec): min=17, max=1597, avg=230.30, stdev=107.11 00:34:41.985 lat (usec): min=31, max=1612, avg=244.43, stdev=108.82 00:34:41.985 clat percentiles (usec): 00:34:41.985 | 50.000th=[ 210], 99.000th=[ 570], 99.900th=[ 644], 99.990th=[ 668], 00:34:41.985 | 99.999th=[ 766] 00:34:41.985 bw ( KiB/s): min=158880, max=245952, per=100.00%, avg=188740.26, stdev=9729.51, samples=76 00:34:41.985 iops : min=39720, max=61488, avg=47185.05, stdev=2432.37, samples=76 00:34:41.985 lat (usec) : 20=0.01%, 50=0.96%, 100=12.99%, 250=57.36%, 500=27.15% 00:34:41.985 lat (usec) : 750=1.54%, 1000=0.01% 00:34:41.985 lat (msec) : 2=0.01% 00:34:41.985 cpu : usr=99.68%, sys=0.00%, ctx=58, majf=0, minf=99 00:34:41.985 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:41.985 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:41.985 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:41.985 issued rwts: total=0,468418,468420,0 short=0,0,0,0 dropped=0,0,0,0 00:34:41.985 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:41.985 00:34:41.985 Run status group 0 (all jobs): 00:34:41.985 WRITE: bw=183MiB/s (192MB/s), 183MiB/s-183MiB/s (192MB/s-192MB/s), io=1830MiB (1919MB), run=10001-10001msec 00:34:41.985 TRIM: bw=183MiB/s (192MB/s), 183MiB/s-183MiB/s (192MB/s-192MB/s), io=1830MiB (1919MB), run=10001-10001msec 00:34:41.985 00:34:41.985 real 0m13.460s 00:34:41.985 user 0m54.150s 00:34:41.985 sys 0m0.481s 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:41.985 ************************************ 00:34:41.985 END TEST bdev_fio_trim 00:34:41.985 ************************************ 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:34:41.985 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:34:41.985 00:34:41.985 real 0m27.249s 00:34:41.985 user 1m48.928s 00:34:41.985 sys 0m1.169s 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:34:41.985 ************************************ 00:34:41.985 END TEST bdev_fio 00:34:41.985 ************************************ 00:34:41.985 07:40:13 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:41.985 07:40:13 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:41.985 07:40:13 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:34:41.985 07:40:13 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:41.985 07:40:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:41.985 ************************************ 00:34:41.985 START TEST bdev_verify 00:34:41.985 ************************************ 00:34:41.985 07:40:13 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:41.985 [2024-07-25 07:40:13.985433] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:34:41.985 [2024-07-25 07:40:13.985489] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1834809 ] 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:41.985 qat_pci_device_allocate(): 
Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:41.985 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.985 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:41.986 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.986 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:41.986 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:41.986 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:41.986 [2024-07-25 07:40:14.125084] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:41.986 [2024-07-25 07:40:14.240084] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:34:41.986 [2024-07-25 07:40:14.240092] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:34:41.986 [2024-07-25 07:40:14.261586] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:41.986 [2024-07-25 07:40:14.269607] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:41.986 [2024-07-25 07:40:14.277623] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:41.986 [2024-07-25 07:40:14.382890] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:44.514 [2024-07-25 
07:40:16.548312] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:44.514 [2024-07-25 07:40:16.548394] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:44.514 [2024-07-25 07:40:16.548409] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:44.514 [2024-07-25 07:40:16.556332] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:44.514 [2024-07-25 07:40:16.556350] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:44.514 [2024-07-25 07:40:16.556360] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:44.514 [2024-07-25 07:40:16.564353] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:44.514 [2024-07-25 07:40:16.564369] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:44.514 [2024-07-25 07:40:16.564379] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:44.514 [2024-07-25 07:40:16.572375] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:44.514 [2024-07-25 07:40:16.572390] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:44.514 [2024-07-25 07:40:16.572401] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:44.514 Running I/O for 5 seconds... 00:34:49.778 00:34:49.778 Latency(us) 00:34:49.778 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:49.778 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x0 length 0x1000 00:34:49.778 crypto_ram : 5.07 530.19 2.07 0.00 0.00 240964.69 3617.59 161061.27 00:34:49.778 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x1000 length 0x1000 00:34:49.778 crypto_ram : 5.07 530.51 2.07 0.00 0.00 240778.51 4141.88 160222.41 00:34:49.778 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x0 length 0x1000 00:34:49.778 crypto_ram1 : 5.07 530.08 2.07 0.00 0.00 240423.74 3958.37 150994.94 00:34:49.778 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x1000 length 0x1000 00:34:49.778 crypto_ram1 : 5.07 530.40 2.07 0.00 0.00 240240.39 4561.31 150156.08 00:34:49.778 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x0 length 0x1000 00:34:49.778 crypto_ram2 : 5.05 4103.96 16.03 0.00 0.00 30949.26 5976.88 28730.98 00:34:49.778 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x1000 length 0x1000 00:34:49.778 crypto_ram2 : 5.05 4109.21 16.05 0.00 0.00 30904.83 8178.89 28730.98 00:34:49.778 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x0 length 0x1000 00:34:49.778 crypto_ram3 : 5.06 4100.65 16.02 0.00 0.00 30870.04 6920.60 26528.97 00:34:49.778 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:49.778 Verification LBA range: start 0x1000 length 0x1000 00:34:49.778 
crypto_ram3 : 5.05 4116.60 16.08 0.00 0.00 30746.98 1861.22 26214.40 00:34:49.778 =================================================================================================================== 00:34:49.778 Total : 18551.61 72.47 0.00 0.00 54919.11 1861.22 161061.27 00:34:49.778 00:34:49.778 real 0m8.151s 00:34:49.778 user 0m15.471s 00:34:49.778 sys 0m0.344s 00:34:49.778 07:40:22 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:49.778 07:40:22 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:34:49.778 ************************************ 00:34:49.778 END TEST bdev_verify 00:34:49.778 ************************************ 00:34:49.778 07:40:22 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:49.778 07:40:22 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:34:49.778 07:40:22 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:49.778 07:40:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:49.778 ************************************ 00:34:49.778 START TEST bdev_verify_big_io 00:34:49.778 ************************************ 00:34:49.778 07:40:22 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:49.778 [2024-07-25 07:40:22.218234] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
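For reference, the bdev_fio_trim summary a few records above is internally consistent; a quick shell cross-check of the figures it reports (468418 writes and 468420 trims of 4 KiB in 10.001 s, all values copied from the fio output above):

# Cross-check the fio trim summary printed earlier in this section.
echo "write bytes: $(( 468418 * 4096 ))"            # 1918640128 B, reported as 1919 MB
echo "write MiB  : $(( 468418 * 4096 / 1048576 ))"  # ~1829 MiB, reported as io=1830MiB
echo "write IOPS : $(( 468418 / 10 ))"              # ~46.8k over ~10 s, reported as IOPS=46.8k

The ~183 MiB/s bandwidth likewise follows from ~1830 MiB transferred over 10.001 s.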
00:34:49.778 [2024-07-25 07:40:22.218278] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1836152 ] 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:49.778 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.778 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:49.779 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:49.779 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:49.779 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:50.037 [2024-07-25 07:40:22.335436] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:50.037 [2024-07-25 07:40:22.420755] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:34:50.037 [2024-07-25 07:40:22.420760] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:34:50.037 [2024-07-25 07:40:22.442089] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:50.037 [2024-07-25 07:40:22.450113] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:50.037 [2024-07-25 07:40:22.458131] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:50.037 [2024-07-25 07:40:22.555755] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:52.607 [2024-07-25 07:40:24.716299] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:52.607 [2024-07-25 07:40:24.716382] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:52.607 [2024-07-25 07:40:24.716400] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:52.607 [2024-07-25 07:40:24.724317] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:52.607 [2024-07-25 07:40:24.724337] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:52.607 [2024-07-25 07:40:24.724347] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:52.607 [2024-07-25 07:40:24.732338] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:52.607 [2024-07-25 07:40:24.732354] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:52.607 [2024-07-25 07:40:24.732364] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:52.607 [2024-07-25 07:40:24.740361] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:52.607 [2024-07-25 07:40:24.740376] bdev.c:8190:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc3 00:34:52.607 [2024-07-25 07:40:24.740387] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:52.607 Running I/O for 5 seconds... 00:34:53.176 [2024-07-25 07:40:25.596037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.596458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.596808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.597774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.601646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.602052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.602073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
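The verify pass above and the big-IO pass now running drive the same four crypto bdevs through bdevperf with the same JSON config, differing only in IO size (4096 vs 65536 bytes). A minimal sketch for repeating those runs by hand, assuming an already-built SPDK checkout; SPDK_DIR is a placeholder and the flags are copied from the commands traced in this log:

# Re-run the two bdevperf verify workloads from this section by hand.
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
BDEV_JSON="$SPDK_DIR/test/bdev/bdev.json"   # same JSON bdev config as above

# 4 KiB verify pass (matches "-q 128 -o 4096 -w verify -t 5 -C -m 0x3" above)
"$SPDK_DIR/build/examples/bdevperf" --json "$BDEV_JSON" -q 128 -o 4096 -w verify -t 5 -C -m 0x3

# 64 KiB verify pass (matches the bdev_verify_big_io run in progress here)
"$SPDK_DIR/build/examples/bdevperf" --json "$BDEV_JSON" -q 128 -o 65536 -w verify -t 5 -C -m 0x3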
00:34:53.176 [2024-07-25 07:40:25.602087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.602102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.605924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.606329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.606346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.606360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.606374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.176 [2024-07-25 07:40:25.609409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.609456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.609494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.609532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.609956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.609997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.610038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.610076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.610491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.610507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.177 [2024-07-25 07:40:25.610521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.610535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.613663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.613707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.613746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.613784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.614768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.617905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.617952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.617990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.618028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.618470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.618530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.618568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.618633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.618976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.618992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
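Stepping back to how the trim job file was assembled at the start of this run: blockdev.sh@355-357 filters the bdev dump for unmap-capable bdevs with jq and appends one [job_<name>] section per match to bdev.fio. A standalone sketch of that flow, assuming a running SPDK application reachable through scripts/rpc.py on its default socket; the bdev_trim.fio output path is a placeholder:

# Build fio job sections for every bdev that reports unmap support,
# mirroring what blockdev.sh traces above.
rpc=./scripts/rpc.py          # assumes a running SPDK app on the default RPC socket
fio_cfg=bdev_trim.fio         # placeholder output file

# bdev_get_bdevs returns a JSON array, hence the leading '.[]' in the jq filter.
for b in $("$rpc" bdev_get_bdevs | jq -r '.[] | select(.supported_io_types.unmap == true) | .name'); do
    printf '[job_%s]\nfilename=%s\n' "$b" "$b" >> "$fio_cfg"
done

fio then consumes the resulting file with --ioengine=spdk_bdev and --spdk_json_conf, as shown in the bdev_fio_trim command traced above.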
00:34:53.177 [2024-07-25 07:40:25.619005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.619018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.622988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.623002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.626368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.626423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.626463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.626534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.626947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.627001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.627075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.627119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.627534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.627550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.177 [2024-07-25 07:40:25.627564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.627577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.630613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.630657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.630695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.630732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.631735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.634773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.634858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.634911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.634949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.635352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.635396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.635434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.635473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.635868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.635884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.177 [2024-07-25 07:40:25.635898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.635914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.638747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.638791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.638831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.638900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.639365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.639407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.639445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.177 [2024-07-25 07:40:25.639483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.639894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.639910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.639924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.639938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.642853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.642896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.642933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.642971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.643398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.643439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.643479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.643516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.643903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.643918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
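The stretch of identical accel_dpdk_cryptodev messages above and below reports that the module could not allocate source mbufs for crypto requests during the big-IO verify pass. When scanning a console log like this one it is usually quicker to count the occurrences than to read them; for example, with the output saved to console.log (placeholder name):

# Count the repeated mbuf-allocation errors instead of reading them one by one.
grep -c 'Failed to get src_mbufs!' console.log

# Bucket them by the second of the in-app [2024-07-25 HH:MM:SS.usec] timestamp.
grep 'Failed to get src_mbufs!' console.log \
    | grep -oE '\[2024-07-25 [0-9]{2}:[0-9]{2}:[0-9]{2}' \
    | sort | uniq -c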
00:34:53.178 [2024-07-25 07:40:25.643931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.643949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.646861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.646905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.646942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.646985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.647948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.650959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.651991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.178 [2024-07-25 07:40:25.652005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.652019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.654814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.654857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.654895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.654933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.655941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.658850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.658891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.658929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.658966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.659396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.659437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.659476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.659513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.659905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.659920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.178 [2024-07-25 07:40:25.659933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.659947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.662732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.662776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.662815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.662854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.663761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.666532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.666575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.666617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.666656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.667072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.667133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.667189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.667227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.667560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.667575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.178 [2024-07-25 07:40:25.667588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.667601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.670323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.670368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.178 [2024-07-25 07:40:25.670407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.670444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.670809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.670851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.670895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.670943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.671280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.671296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.671310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.671323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.674901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.675265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.675281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.179 [2024-07-25 07:40:25.675294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.675308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.678734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.679156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.679173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.679187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.679201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.681851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.681894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.681932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.681969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.682469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.682511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.682550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.682588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.682981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.682997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.179 [2024-07-25 07:40:25.683010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.683024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.685543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.685585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.685629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.685666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.686633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.689367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.689410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.689449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.689487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.689911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.689951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.689989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.690048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.690466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.690482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.179 [2024-07-25 07:40:25.690496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.690510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.693852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.694271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.694288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.694302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.694317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.696960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.697025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.697077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.697116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.697575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.697616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.697654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.697692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.698091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.179 [2024-07-25 07:40:25.698106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.179 [2024-07-25 07:40:25.698122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.698136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.700642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.700684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.700722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.700760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.701692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.704365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.704408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.704446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.704488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.704896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.704949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.705005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.705042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.705382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.705397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.180 [2024-07-25 07:40:25.705411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.180 [2024-07-25 07:40:25.705424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.440 [2024-07-25 07:40:25.708035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.440 [2024-07-25 07:40:25.708079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.708857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.710611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.710662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.710699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.710736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.711021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.711060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.711097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.711134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.711448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.711468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.441 [2024-07-25 07:40:25.711481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.711495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.714801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.715046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.715061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.715074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.715087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.716767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.716829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.716866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.716898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.717184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.717224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.717261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.717298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.717565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.717580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.441 [2024-07-25 07:40:25.717593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.717607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.721681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.723043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.724531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.726075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.727947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.729494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.731022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.732345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.732758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.732773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.732787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.732800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.736357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.737896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.739427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.740150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.741734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.743267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.744759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.745125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.745539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.745556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.441 [2024-07-25 07:40:25.745570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.745585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.749283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.750825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.751718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.753313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.755090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.756622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.757053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.757415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.757822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.757838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.757856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.757869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.761513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.762801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.764179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.765470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.767257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.768088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.768450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.768807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.769222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.441 [2024-07-25 07:40:25.769238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.442 [2024-07-25 07:40:25.769252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.769267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.772607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.773500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.774777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.776313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.777899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.778267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.778623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.778979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.779397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.779414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.779427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.779441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.781892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.783414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.785062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.786591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.787232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.787597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.787953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.788316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.788720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.788735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.442 [2024-07-25 07:40:25.788748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.788761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.791638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.792937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.794461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.795984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.796726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.797084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.797450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.797807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.798059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.798074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.798087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.798100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.800978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.802507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.804040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.804842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.805626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.805985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.806345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.807660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.807967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.807981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.442 [2024-07-25 07:40:25.807995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.808008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.811207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.812744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.814010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.814375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.815137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.815499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.816340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.817622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.817874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.817889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.817902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.817915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.821019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.822536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.822900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.823264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.824004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.824368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.825892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.827551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.827804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.827819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.442 [2024-07-25 07:40:25.827832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.827845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.831036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.831819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.832185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.832543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.833387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.834843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.836188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.837715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.837966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.837981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.837994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.838007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.840888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.841262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.841625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.841984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.843433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.442 [2024-07-25 07:40:25.844720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.846257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.847782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.848125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.848147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.443 [2024-07-25 07:40:25.848161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.848174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.850034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.850400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.850757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.851115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.852706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.854221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.855738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.856799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.857050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.857065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.857078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.857091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.859092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.859461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.859823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.860489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.862329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.863961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.865446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.866616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.866900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.866915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.443 [2024-07-25 07:40:25.866928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.866941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.869118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.869507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.869866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.871359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.873151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.874669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.875422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.876711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.876962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.876978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.876991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.877004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.879249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.879613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.881107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.882484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.884275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.884985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.886430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.887985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.888240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.888256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.443 [2024-07-25 07:40:25.888268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.888281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.890762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.891851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.893143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.894662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.896045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.897592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.898975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.900487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.900741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.900756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.900769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.900782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.903719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.905009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.906539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.908067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.909340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.910618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.912134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.913670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.914038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.914056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.443 [2024-07-25 07:40:25.914069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.914083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.918529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.920198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.921750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.923194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.924785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.926322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.927854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.928903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.929305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.929321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.929335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.929349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.443 [2024-07-25 07:40:25.932782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.934326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.935870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.936575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.938433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.940054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.941550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.941912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.942322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.942338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.444 [2024-07-25 07:40:25.942351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.942365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.945770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.947301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.948109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.949637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.951412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.952930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.953326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.953684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.954074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.954089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.954103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.954116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.957416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.958713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.960074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.961363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.963152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.963994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.964358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.964714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.965180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.965196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.444 [2024-07-25 07:40:25.965210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.965224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.968347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.969199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.970483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.444 [2024-07-25 07:40:25.972008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.973613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.973974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.974334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.974689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.975094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.975110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.975124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.975143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.977370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.978853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.980436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.981975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.982592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.982951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.706 [2024-07-25 07:40:25.983312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.983676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.983967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.983982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.707 [2024-07-25 07:40:25.983995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.984008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.987036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.988475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.990019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.991689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.992416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.992776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.993134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.994062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.994359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.994375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.994388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.994401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.997178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:25.998712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.000224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.000848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.001635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.002006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.002478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.003795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.004048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.004063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.707 [2024-07-25 07:40:26.004079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.004092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.007170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.008702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.009742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.010106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.010878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.011244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.012530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.013816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.014066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.014081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.014094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.014107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.017186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.018767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.019145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.019503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.020224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.021100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.022389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.023792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.024043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.024058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.707 [2024-07-25 07:40:26.024072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.024085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.025942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.026308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.026665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.027023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.027729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.028091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.028452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.028809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.029224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.029240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.029253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.029266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.031726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.032091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.032463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.032828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.033583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.033944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.034308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.034671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.035054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.035069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.707 [2024-07-25 07:40:26.035083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.035097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.037721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.038086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.038448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.038495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.039306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.039677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.040044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.040408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.040822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.040837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.707 [2024-07-25 07:40:26.040850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.040868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.043354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.043714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.044071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.044439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.044491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.044916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.045290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.045649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.046004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.046370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.708 [2024-07-25 07:40:26.046710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.046725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.046739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.046753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.048959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.049695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.050054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.050070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.050083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.050097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.052392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.052434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.052478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.052517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.052838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.052895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.052935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.708 [2024-07-25 07:40:26.052974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.053012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.053342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.053358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.053371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.053385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.055776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.055830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.055868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.055919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.056918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.708 [2024-07-25 07:40:26.059578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.059699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.060145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.060165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.060180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.060194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.062402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.062444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.062497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.062548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.062890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.062939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.062977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.063015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.063053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.063469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.063486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.063499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.063514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.065551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.065594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.065631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.708 [2024-07-25 07:40:26.065669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.066078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.066123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.066168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.066206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.708 [2024-07-25 07:40:26.066248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.066653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.066668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.066686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.066700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.068791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.068833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.068870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.068908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.069892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.071979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.709 [2024-07-25 07:40:26.072022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.072064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.072102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.072523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.072568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.072628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.072666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.072703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.073121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.073142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.073156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.073171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.075388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.075431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.075470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.075512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.075890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.075938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.075977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.076016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.076053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.076470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.076489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.076506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.709 [2024-07-25 07:40:26.076519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.078720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.078761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.078801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.078838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.079849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.081997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.082682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.083015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.709 [2024-07-25 07:40:26.083031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.083045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.083058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.085936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.086346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.086362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.086375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.086388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.088619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.088661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.088699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.088737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.089046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.089105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.089150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.709 [2024-07-25 07:40:26.089190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.710 [2024-07-25 07:40:26.089229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.089583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.089598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.089612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.089630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.092789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.093167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.093183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.093197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.093210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.095348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.095390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.095427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.095466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.095784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.710 [2024-07-25 07:40:26.095841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.979 [2024-07-25 07:40:26.342736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.343092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.343547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.343564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.343578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.343592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.346659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.347373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.979 [2024-07-25 07:40:26.348661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.350187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.350438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.351892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.352258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.352617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.352974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.353400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.353417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.353431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.353445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.355641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.357118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.358703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.360239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.360500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.980 [2024-07-25 07:40:26.360870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.361233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.361591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.361949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.362235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.362251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.362264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.362278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.365182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.366509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.368031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.369594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.370039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.370415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.370773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.371129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.372103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.372395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.372411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.372424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.372441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.375179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.376691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.378220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.980 [2024-07-25 07:40:26.378889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.379347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.379715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.380072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.380615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.381886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.382133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.382155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.382168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.382181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.385268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.386803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.387723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.388100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.388519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.388884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.389247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.390818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.392495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.392743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.392757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.392770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.392783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.395825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.980 [2024-07-25 07:40:26.397103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.397467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.397830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.398273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.398637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.400137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.401507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.403039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.403292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.403308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.403321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.403334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.406549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.406912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.407273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.407630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.408042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.408710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.410270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.411797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.412659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.412979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.412994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.413007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.980 [2024-07-25 07:40:26.413020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.414986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.415353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.980 [2024-07-25 07:40:26.415711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.416072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.416521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.416903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.417265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.417622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.417986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.418452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.418468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.418481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.418494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.421156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.421522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.421882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.422245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.422627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.422991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.423357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.423721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.424080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.424494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.981 [2024-07-25 07:40:26.424510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.424524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.424539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.427046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.427417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.427775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.428135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.428588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.428960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.429324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.429679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.430040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.430439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.430455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.430469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.430486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.432930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.433299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.433661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.434020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.434453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.434817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.435178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.435537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.981 [2024-07-25 07:40:26.435899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.436301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.436317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.436330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.436345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.438858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.439226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.439583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.439940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.440364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.440731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.441094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.441459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.441816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.442254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.442273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.442287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.442301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.444789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.445156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.445202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.445560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.445922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.446297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.981 [2024-07-25 07:40:26.446655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.447011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.447373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.447714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.447729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.447743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.447756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.450247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.450610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.450979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.451033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.451485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.451848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.452210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.452570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.452940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.453285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.453302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.453316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.453329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.455680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.981 [2024-07-25 07:40:26.455736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.455783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.455822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.982 [2024-07-25 07:40:26.456171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.456781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.458938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.458980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.459030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.459068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.459566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.459638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.459678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.459715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.459752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.460169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.460185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.460199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.460213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.462342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.462394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.982 [2024-07-25 07:40:26.462432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.462469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.462866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.462910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.462948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.462987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.463026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.463447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.463464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.463477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.463491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.465713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.465767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.465805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.465842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.466852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.982 [2024-07-25 07:40:26.468949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.468991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.469029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.469067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.469476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.469526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.469565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.469622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.469661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.470089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.470105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.470120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.470133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.472883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.473298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.473314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.982 [2024-07-25 07:40:26.473329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.473343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.475387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.475429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.475484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.475523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.475973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.476560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.478674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.478716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.982 [2024-07-25 07:40:26.478756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.478794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.983 [2024-07-25 07:40:26.479691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.479734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.481924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.481966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.482004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.482041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.482441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.482501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.482556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.482612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.482659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.483114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.483129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.483147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.483161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.485350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.485393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.485431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.485469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.485818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.485874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.485914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.983 [2024-07-25 07:40:26.485952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.486001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.486332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.486348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.486361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.486375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.488923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.488970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.489023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.489078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.489444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.489506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.489572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.489621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.489659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.490020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.490036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.490049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.490062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.492210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.492253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.492295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.492354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:53.983 [2024-07-25 07:40:26.492723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:53.983 [2024-07-25 07:40:26.492779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
... [the same "*ERROR*: Failed to get src_mbufs!" message from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats for every log entry in this span, timestamps 2024-07-25 07:40:26.492834 through 07:40:26.747313 (console time 00:34:53.983-00:34:54.250)] ...
00:34:54.250 [2024-07-25 07:40:26.747313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:54.251 [2024-07-25 07:40:26.748830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.749720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.750177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.750545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.750905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.751273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.752955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.753220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.753235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.753248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.753261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.756289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.757968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.759503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.759862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.760279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.760642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.760997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.762188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.763459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.763710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.763726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.763738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.763752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.251 [2024-07-25 07:40:26.766842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.768444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.768815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.769183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.769647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.770018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.771572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.773247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.774934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.775280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.775296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.775309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.251 [2024-07-25 07:40:26.775323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.777275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.777646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.778004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.779504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.779816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.781403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.782932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.783519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.785165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.785416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.785430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.514 [2024-07-25 07:40:26.785443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.785456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.787659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.788029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.788399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.788765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.789175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.789541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.789898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.790260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.790620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.790990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.791010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.791023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.791037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.793518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.793891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.794257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.794616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.795024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.795395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.795755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.796118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.796490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.514 [2024-07-25 07:40:26.796912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.796928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.796942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.796958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.799720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.800085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.800453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.800814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.801235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.801619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.801978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.802346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.802708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.803077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.803093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.803106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.803120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.805578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.805947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.806316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.806680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.807091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.807465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.807825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.514 [2024-07-25 07:40:26.808190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.808553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.808967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.808982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.808996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.809009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.811523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.811910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.812277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.812641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.813051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.813428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.813802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.814169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.814526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.814952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.814968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.814982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.814996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.817434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.817802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.818170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.818536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.514 [2024-07-25 07:40:26.818935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.515 [2024-07-25 07:40:26.819311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.819673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.820030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.820402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.820810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.820825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.820838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.820851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.823444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.823813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.824179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.824536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.824944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.825315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.825675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.826036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.826406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.826823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.826839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.826853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.826868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.829306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.829667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.830024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.515 [2024-07-25 07:40:26.830393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.830804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.831180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.831555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.831913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.832285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.832703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.832724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.832738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.832752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.835218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.835584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.835652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.836013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.836402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.836771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.837130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.837496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.837856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.838260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.838276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.838290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.838303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.840866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.515 [2024-07-25 07:40:26.841239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.841603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.841648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.842059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.842433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.842793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.843156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.843527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.843917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.843933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.843947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.843960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.846902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.847318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.847335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.847349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.515 [2024-07-25 07:40:26.847363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.849450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.849494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.849532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.849570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.849947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.849991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.850031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.850069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.850107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.850520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.850537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.850551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.850565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.852875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.852919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.852957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.852995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.515 [2024-07-25 07:40:26.853419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.853464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.853507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.853548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.853586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.853927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.516 [2024-07-25 07:40:26.853942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.853956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.853969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.856821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.857288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.857305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.857319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.857333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.859542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.859585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.859623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.859662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.859996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.860054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.860093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.860131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.516 [2024-07-25 07:40:26.860174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.860591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.860608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.860626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.860642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.862764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.862807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.862846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.862886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.863944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.516 [2024-07-25 07:40:26.866764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.866841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.867123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.867144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.867157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.867171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.869735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.870155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.870173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.870187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.870202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.516 [2024-07-25 07:40:26.872751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.872928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.873334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.873351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.873364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.873378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.874782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.516 [2024-07-25 07:40:26.874823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.874860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.874904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.875801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.877566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.517 [2024-07-25 07:40:26.877610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.517 [2024-07-25 07:40:26.877649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:54.517 … (the same *ERROR* line repeats continuously between these two timestamps) …
00:34:54.785 [2024-07-25 07:40:27.123014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:54.785 [2024-07-25 07:40:27.123026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.126094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.127631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.127993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.128358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.128734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.129098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.129468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.130992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.132515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.132766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.132780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.132794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.132808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.134727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.135092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.135462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.135823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.136074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.137358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.138894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.140431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.140833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.141083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.785 [2024-07-25 07:40:27.141098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.141111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.141124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.143066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.143451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.143814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.144181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.144628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.145010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.145376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.145736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.146097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.146457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.146475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.146494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.146508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.148978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.149352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.149714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.150078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.150495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.150864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.151229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.151590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.785 [2024-07-25 07:40:27.151953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.152350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.152366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.152380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.152393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.154855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.155234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.155593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.155968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.156385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.156756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.157118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.157498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.157856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.158276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.158295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.158309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.158323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.160776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.161158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.161521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.161891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.162285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.162663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.785 [2024-07-25 07:40:27.163028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.163393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.163755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.164147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.164164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.164178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.164191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.166861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.167240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.167621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.167978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.168346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.168716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.169076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.785 [2024-07-25 07:40:27.169457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.169824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.170240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.170258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.170272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.170286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.172713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.173078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.173446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.173807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.786 [2024-07-25 07:40:27.174178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.174550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.174913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.175283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.175642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.176039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.176057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.176071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.176085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.178551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.178917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.179281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.179642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.180055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.180431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.180790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.181156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.181515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.181841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.181856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.181870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.181883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.184130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.184499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.786 [2024-07-25 07:40:27.184866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.185233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.185486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.185896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.186260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.187950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.188324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.188743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.188758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.188772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.188790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.191087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.192506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.192875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.193239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.193688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.194210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.195487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.195846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.196654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.196914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.196930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.196943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.196956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.786 [2024-07-25 07:40:27.200042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.200413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.201470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.202221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.202632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.203001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.203369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.204488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.205171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.205578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.205598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.205612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.205626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.207921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.209278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.209709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.210065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.210328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.210966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.211332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.211691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.212054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.212310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.212326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.786 [2024-07-25 07:40:27.212339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.212352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.214777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.215153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.215205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.215689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.215939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.216316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.216676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.218190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.786 [2024-07-25 07:40:27.218548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.218964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.218979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.218992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.219007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.221624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.222832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.223197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.223242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.223609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.223981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.224770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.225771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.226134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.787 [2024-07-25 07:40:27.226458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.226474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.226487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.226500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.228527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.228571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.228608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.228661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.228911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.228956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.229001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.229052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.229090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.229524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.229541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.229557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.229571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.231602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.231658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.231707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.231745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.787 [2024-07-25 07:40:27.232236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.232563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.234496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.234539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.234578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.234616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.235657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.237532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.237574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.237611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.237648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.237994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.787 [2024-07-25 07:40:27.238048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.238088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.238126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.238170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.238577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.238593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.238607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.238621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.240882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.241296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.241313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.241330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.241344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.243345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.243388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.243437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.787 [2024-07-25 07:40:27.243475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.243842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.243914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.243965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.244003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.244040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.787 [2024-07-25 07:40:27.244455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.244471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.244485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.244499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.246746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.246790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.246828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.246866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.247895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.249864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.788 [2024-07-25 07:40:27.249915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.249956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.249995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.250743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.252856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.788 [2024-07-25 07:40:27.253279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.789 [2024-07-25 07:40:27.253295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:54.789 [2024-07-25 07:40:27.253309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:54.789 [2024-07-25 07:40:27.253323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:54.789 [... the same *ERROR* line from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources: Failed to get src_mbufs!) repeats continuously, with log timestamps 2024-07-25 07:40:27.253323 through 07:40:27.527852 and console timestamps 00:34:54.789 through 00:34:55.055 ...]
00:34:55.055 [2024-07-25 07:40:27.527852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:55.055 [2024-07-25 07:40:27.528099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.528114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.528127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.528144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.530785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.531157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.531516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.531873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.532287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.532651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.533010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.533376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.533755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.534163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.534180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.534196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.534211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.536909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.537274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.537632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.537992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.538412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.538788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.539152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.055 [2024-07-25 07:40:27.539508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.539864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.540231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.540246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.055 [2024-07-25 07:40:27.540260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.540273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.542752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.543114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.543478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.543838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.544251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.544613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.544969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.545332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.545696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.546125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.546145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.546160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.546173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.548669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.549032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.549394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.549751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.550135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.056 [2024-07-25 07:40:27.550538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.550903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.551267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.551627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.552021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.552037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.552055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.552069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.554580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.554941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.555307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.555671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.556116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.556485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.556842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.557203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.557568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.557961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.557977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.557991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.558004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.561044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.561416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.561781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.056 [2024-07-25 07:40:27.562137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.562558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.562927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.563299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.563662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.564019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.564410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.564426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.564440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.564453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.566905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.567270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.567628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.567987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.568340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.568709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.569065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.569425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.569787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.570200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.570216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.570229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.570242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.572783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.056 [2024-07-25 07:40:27.573172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.573539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.573902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.574256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.574620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.574978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.575348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.056 [2024-07-25 07:40:27.575709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.576113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.576131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.576154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.576169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.578568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.578932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.579294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.579653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.580001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.580376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.580736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.581091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.581451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.581874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.581890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.057 [2024-07-25 07:40:27.581903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.057 [2024-07-25 07:40:27.581916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.584329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.584693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.585058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.585421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.585827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.586196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.586553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.586910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.587277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.587631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.587647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.587660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.587674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.590217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.590584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.590941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.591320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.591745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.592115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.592482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.592853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.593214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.593649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.323 [2024-07-25 07:40:27.593664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.593678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.593693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.596226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.596591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.596952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.597316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.597702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.598064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.598426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.598781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.599148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.599473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.599488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.599502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.599515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.602816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.603440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.604929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.605302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.605743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.606105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.606464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.606827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.323 [2024-07-25 07:40:27.607207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.607624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.607643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.607656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.607669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.610166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.610541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.323 [2024-07-25 07:40:27.610899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.611259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.611674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.613022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.614299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.615811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.617329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.617759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.617774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.617787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.617800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.622709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.623073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.623116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.624577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.624863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.626409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.324 [2024-07-25 07:40:27.627935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.628661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.630095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.630348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.630363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.630376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.630389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.632579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.632940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.634036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.634080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.634411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.635955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.637482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.638378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.639982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.640236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.640251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.640264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.640277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.324 [2024-07-25 07:40:27.642556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.642718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.643119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.643135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.643155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.643169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.644585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.644626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.644663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.644707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.645537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.324 [2024-07-25 07:40:27.647286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.647886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.648336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.648352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.648366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.648380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.649884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.649925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.649962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.649999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.650771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.324 [2024-07-25 07:40:27.652346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.324 [2024-07-25 07:40:27.652386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.652424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.652464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.652867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.652915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.652957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.652996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.653033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.653486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.653502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.653515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.653530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.325 [2024-07-25 07:40:27.655950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.655963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.657492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.657552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.657594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.657632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.658643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.325 [2024-07-25 07:40:27.660884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.325 [2024-07-25 07:40:27.661251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:55.325 [... the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! entry repeats several hundred times, with timestamps running from 07:40:27.661251 through 07:40:27.876602; the intervening duplicate entries are omitted here ...]
00:34:55.594 [2024-07-25 07:40:27.876602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:55.594 [2024-07-25 07:40:27.876959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.877322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.877684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.878126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.878146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.878160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.878173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.880753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.881125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.881491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.881849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.882235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.882598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.882956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.883325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.883689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.884097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.884113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.884127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.884146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.886716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.887077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.887440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.887801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.594 [2024-07-25 07:40:27.888265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.888641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.889006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.889367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.889724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.890088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.890103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.890116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.890130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.892660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.893021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.893392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.893760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.894167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.894533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.894890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.895258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.895622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.896079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.896095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.896109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.896123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.898629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.898993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.594 [2024-07-25 07:40:27.899353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.899707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.900072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.594 [2024-07-25 07:40:27.900449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.900809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.901173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.901528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.901947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.901964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.901983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.901997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.904461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.904824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.905194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.905558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.906028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.906400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.906757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.907116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.907492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.907903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.907918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.907931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.907944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.595 [2024-07-25 07:40:27.910452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.910821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.911187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.911543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.911955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.912328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.912690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.913048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.913411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.913843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.913861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.913878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.913893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.916345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.916709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.917069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.917464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.917851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.918226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.918583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.918938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.919304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.919734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.919750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.595 [2024-07-25 07:40:27.919763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.919776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.922279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.922646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.923006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.923366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.923789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.924158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.924517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.924879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.925258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.925661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.925677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.925691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.925705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.928238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.928601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.928959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.929327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.929713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.930081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.930446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.930807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.931172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.595 [2024-07-25 07:40:27.931548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.931564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.931577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.931591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.934084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.934942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.935866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.937054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.937388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.937760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.938118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.938480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.595 [2024-07-25 07:40:27.938839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.939197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.939212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.939226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.939240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.941870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.942245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.942602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.942960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.943345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.943710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.944071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.596 [2024-07-25 07:40:27.944438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.945859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.946281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.946297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.946310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.946331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.949912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.951359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.952909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.954577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.954934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.956215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.957735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.959263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.960481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.960887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.960903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.960917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.960930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.964264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.965804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.967334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.968048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.968307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.596 [2024-07-25 07:40:27.969965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.971555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.973029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.974130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.974457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.974472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.974486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.974500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.977644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.978925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.980455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.981985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.982366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.983986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.985618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.987228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.988715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.989114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.989129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.596 [2024-07-25 07:40:27.989147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:27.989161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:27.992532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:27.994062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:27.995590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.597 [2024-07-25 07:40:27.996303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:27.996556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:27.997983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:27.999535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.001203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.002075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.002339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.002356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.002369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.002383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.005278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.006546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.006590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.008118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.008376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.009425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.011059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.012554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.014182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.014435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.014450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.014462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.014476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.017664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.597 [2024-07-25 07:40:28.018941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.020460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.020503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.020753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.021921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.023399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.024713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.026254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.026506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.026521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.026534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.026547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.028554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.028595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.028632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.028669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.028977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.029028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.029067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.029104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.029147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.029556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.029572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.029585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.597 [2024-07-25 07:40:28.029604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.031988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.032001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.033632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.033675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.033715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.033753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.597 [2024-07-25 07:40:28.034771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.034799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.036363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.036406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.036443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.036480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.036772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.597 [2024-07-25 07:40:28.036823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.036863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.036900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.036942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.037193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.037209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.037221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.037235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.038725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.038766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.038803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.038840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.598 [2024-07-25 07:40:28.039267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.039723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.041551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.041591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.041631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.041674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.041921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.041970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.042008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.042052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.042094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.042347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.042362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.042376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.042389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.043937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.043978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.044015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.044053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.044304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.598 [2024-07-25 07:40:28.044356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.598 [2024-07-25 07:40:28.044394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.598 [... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously through 07:40:28.265 ...] 
00:34:55.865 [2024-07-25 07:40:28.265542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:55.865 [2024-07-25 07:40:28.265904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.266198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.267292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.267651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.268719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.269427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.269840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.269856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.269870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.269885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.273031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.273404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.274923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.275291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.275703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.277119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.277496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.277856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.278223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.278574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.278589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.278603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.278616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.281271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.865 [2024-07-25 07:40:28.281651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.283062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.283424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.283792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.285291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.285648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.865 [2024-07-25 07:40:28.286007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.286382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.286715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.286730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.286744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.286757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.290413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.290781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.291971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.292569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.292974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.293348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.293711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.294069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.294431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.294851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.294867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.294881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.866 [2024-07-25 07:40:28.294895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.297325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.297687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.299343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.299708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.300113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.300489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.300854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.301219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.301577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.301999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.302015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.302032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.302046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.307163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.307531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.307892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.308261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.308647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.309012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.309378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.309735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.310095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.310438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.866 [2024-07-25 07:40:28.310455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.310468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.310482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.313149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.313516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.313878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.314257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.314666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.315030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.315397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.315756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.316118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.316482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.316498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.316511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.316525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.322232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.322606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.322963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.323324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.323732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.324095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.324471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.325202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.866 [2024-07-25 07:40:28.326258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.326658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.326675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.326689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.326703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.328938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.329311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.329670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.330028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.330460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.330826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.331196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.332345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.332992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.333402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.333421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.333437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.333451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.337461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.337826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.338190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.338554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.338895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.340447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.866 [2024-07-25 07:40:28.340815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.341188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.866 [2024-07-25 07:40:28.342610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.343042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.343058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.343072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.343086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.345593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.345959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.346329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.346690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.347068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.348331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.348688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.349589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.350467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.350879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.350895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.350909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.350924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.354973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.355344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.355711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.356075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.867 [2024-07-25 07:40:28.356332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.356767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.357121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.358805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.359174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.359584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.359600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.359613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.359626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.361888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.363388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.363747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.364105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.364504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.365056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.366304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.366662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.367570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.367833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.367849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.367863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.367876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.372552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.373843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:55.867 [2024-07-25 07:40:28.375391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.376925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.377281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.378889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.379254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.379831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.381049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.381459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.381476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.381490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.381503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.384565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.385709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.387000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.388533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.388783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.389829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.391335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.391696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.392053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.392312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.392332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.392345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:55.867 [2024-07-25 07:40:28.392358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.130 [2024-07-25 07:40:28.396547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.397836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.399364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.400893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.401149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.402146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.402944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.403307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.404652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.405014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.405030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.405043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.405056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.408237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.408997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.410462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.412049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.412304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.413832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.414508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.415608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.415967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.416298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.416314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.130 [2024-07-25 07:40:28.416327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.416340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.421345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.422937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.422988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.424591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.424842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.426399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.426980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.428257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.428614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.428977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.428992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.429005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.429018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.432513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.434044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.435044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.435089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.435345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.436624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.438156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.439677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.440306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.130 [2024-07-25 07:40:28.440555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.440572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.440586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.440599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.442995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.130 [2024-07-25 07:40:28.443789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.443804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.443817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.443832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.131 [2024-07-25 07:40:28.445859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.445898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.446155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.446170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.446183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.446196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.449950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.451498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.451540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.451577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.451614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.451855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.131 [2024-07-25 07:40:28.451907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.451945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.451982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.452019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.452371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.452387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.452400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.452413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.457935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.458183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.458199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.458213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.458227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.459789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.459831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.131 [2024-07-25 07:40:28.459891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.131 [2024-07-25 07:40:28.459930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! message repeats several hundred times (~500 entries) across this stretch of console output, from 2024-07-25 07:40:28.459930 through 07:40:28.723019, console timestamps 00:34:56.131 through 00:34:56.395]
00:34:56.395 [2024-07-25 07:40:28.726994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.728117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.728793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.729159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.729416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.730212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.730571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.730928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.731299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.731555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.731570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.731583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.731596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.734640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.395 [2024-07-25 07:40:28.736159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.736525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.736882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.737131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.737558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.737915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.738286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.738649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.738902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.738917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.396 [2024-07-25 07:40:28.738930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.738943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.741847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.743349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.743709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.744354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.744606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.744975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.745342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.745708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.746420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.746673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.746688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.746701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.746714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.749864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.750966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.751331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.752499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.752932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.753314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.753678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.754039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.755629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.396 [2024-07-25 07:40:28.756077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.756092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.756107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.756121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.760252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.760626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.761019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.762419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.762844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.763219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.763582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.763976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.765379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.765845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.765860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.765874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.765888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.770339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.770714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.771157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.772500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.772941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.773316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.773683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.396 [2024-07-25 07:40:28.774124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.775484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.775936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.775952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.775970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.775984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.780344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.780710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.781293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.782492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.782904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.783278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.783661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.784245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.785453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.785866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.785882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.785896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.785910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.790343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.790712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.791408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.792491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.792912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.396 [2024-07-25 07:40:28.793284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.793651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.794344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.795418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.795829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.795845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.795859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.795873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.800314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.800681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.801540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.802469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.802864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.803237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.803601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.804427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.805390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.805783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.805800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.805814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.805827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.810230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.810598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.811465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.396 [2024-07-25 07:40:28.812378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.812781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.813157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.813521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.814422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.815314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.815723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.815742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.815756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.815770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.820107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.820481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.821494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.822273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.822682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.823048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.823416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.824437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.825198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.825618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.825634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.825648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.825662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.830599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.396 [2024-07-25 07:40:28.831851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.832392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.832750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.833012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.833740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.834100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.834464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.834846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.835102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.835117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.835130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.835150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.838288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.839690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.840082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.840446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.840698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.841237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.841597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.841961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.842333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.842585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.842601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.842618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.396 [2024-07-25 07:40:28.842631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.845552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.847018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.847384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.848074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.848338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.848710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.849435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.850704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.852222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.852475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.852490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.852503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.852516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.396 [2024-07-25 07:40:28.858176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.858549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.858908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.860383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.860807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.861178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.862634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.864213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.865732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.865985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.397 [2024-07-25 07:40:28.866000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.866013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.866026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.870860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.871238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.872762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.873131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.873550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.875099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.876539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.878079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.879749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.880086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.880101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.880115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.880128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.884455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.885682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.886256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.886614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.886864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.888144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.889675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.891182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.397 [2024-07-25 07:40:28.891872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.892122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.892144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.892157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.892171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.896908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.897984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.898350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.899557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.899840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.901382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.902888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.903840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.905506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.905757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.905772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.905784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.905798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.911889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.912261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.912305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.913346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.913660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.915208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.397 [2024-07-25 07:40:28.916731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.917852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.919387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.919703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.919718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.919731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.919744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.926383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.397 [2024-07-25 07:40:28.926751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.927395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.927440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.927771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.929355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.930878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.932214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.933539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.933849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.933863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.933876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.933893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.658 [2024-07-25 07:40:28.939581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.939745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.940127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.940149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.940163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.940180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.944940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.945192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.945208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.945220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.945233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.948512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.948560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.658 [2024-07-25 07:40:28.948599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.948638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.949054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.949100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.949157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.658 [2024-07-25 07:40:28.949199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.949236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.949482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.949498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.949511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.949525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.953542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.953589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.953626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.953663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.953911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.953962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.954000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.954038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.954075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.954438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.954454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.954467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.659 [2024-07-25 07:40:28.954481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.659 [2024-07-25 07:40:28.959188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:56.659 [... identical accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! messages repeated several hundred times, differing only in their microsecond timestamps, from 07:40:28.959188 through 07:40:29.194381 ...]
00:34:56.925 [2024-07-25 07:40:29.194789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:56.925 [2024-07-25 07:40:29.194806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.194819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.194833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.198012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.198397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.198764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.199125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.199521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.199886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.925 [2024-07-25 07:40:29.200251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.200616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.200992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.201435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.201451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.201465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.201479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.204633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.204998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.205365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.205727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.206135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.206506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.206862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.207223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.926 [2024-07-25 07:40:29.207591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.207963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.207978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.207991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.208005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.211268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.211633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.211990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.212355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.212705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.213070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.213432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.213788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.214151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.214473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.214489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.214502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.214516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.217826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.218219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.218579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.218935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.219337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.220882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.926 [2024-07-25 07:40:29.221255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.222733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.223093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.223514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.223531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.223545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.223563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.226009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.226379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.226737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.227092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.227442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.227812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.228177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.228534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.228909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.229323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.229339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.229353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.229367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.231924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.232293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.232659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.233018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.926 [2024-07-25 07:40:29.233432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.233795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.234155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.235236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.236510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.236759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.236773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.236786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.236800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.239823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.241388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.241805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.242167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.242536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.242895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.243666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.244937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.246467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.246717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.246732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.246745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.246757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.249802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.250569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.926 [2024-07-25 07:40:29.250929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.251289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.251725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.926 [2024-07-25 07:40:29.252106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.253518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.255069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.256592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.256843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.256858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.256871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.256884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.259539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.259905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.260266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.260634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.261055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.262655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.264122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.265699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.267392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.267721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.267735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.267748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.267761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.927 [2024-07-25 07:40:29.269566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.269926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.270287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.270643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.270893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.272173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.273701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.275228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.275944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.276197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.276213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.276227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.276240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.278182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.278544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.278902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.280212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.280540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.282080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.283599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.284468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.286037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.286291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.286306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.927 [2024-07-25 07:40:29.286319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.286332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.288532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.288889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.289841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.291113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.291367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.292915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.294133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.295577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.296895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.297147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.297163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.297175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.297189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.299471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.300124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.301418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.302934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.303189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.304737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.305878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.307160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.308684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.927 [2024-07-25 07:40:29.308933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.308948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.308961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.308974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.311519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.312888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.314420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.315943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.316198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.317130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.318410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.319937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.321472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.321784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.321799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.321813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.321827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.326107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.327701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.329381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.330924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.331311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.927 [2024-07-25 07:40:29.332595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.334127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.928 [2024-07-25 07:40:29.335663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.336700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.337107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.337122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.337136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.337156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.340439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.341965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.343491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.344239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.344527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.346154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.347685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.349060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.349423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.349846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.349862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.349875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.349890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.353271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.354798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.355522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.356805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.357054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.928 [2024-07-25 07:40:29.358627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.360300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.360662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.361022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.361424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.361440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.361454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.361467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.364694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.365624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.367237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.368854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.369107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.370657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.371056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.371420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.371777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.372189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.372205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.372218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.372232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.375011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.376417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.377652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.928 [2024-07-25 07:40:29.379164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.379415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.380255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.380616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.380971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.381331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.381715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.381730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.381743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.381756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.384312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.385592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.385636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.387166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.387417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.388305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.388664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.389020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.389382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.389771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.389786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.389799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.389812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.392038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.928 [2024-07-25 07:40:29.393369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.394887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.394931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.395187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.396159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.396534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.396890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.397253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.397658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.397673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.397686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.397699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.399135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.399188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.399226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.399263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.399534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.399581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.928 [2024-07-25 07:40:29.399619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.399656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.399693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.399966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.399981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.399994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.929 [2024-07-25 07:40:29.400007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.401769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.401812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.401852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.401891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.402866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.404950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:56.929 [2024-07-25 07:40:29.405230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:56.929 [2024-07-25 07:40:29.405245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:56.929 [... same *ERROR*: Failed to get src_mbufs! entry from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated several hundred times between 07:40:29.405245 and 07:40:29.545521; identical repetitions elided ...]
00:34:57.195 [2024-07-25 07:40:29.545521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:57.195 [2024-07-25 07:40:29.545880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.546241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.546649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.546668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.546682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.546695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.549112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.549482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.549840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.550208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.550613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.551001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.551362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.551722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.552082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.552341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.552357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.552370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.552384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.554646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.555009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.555376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.555736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.556130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.195 [2024-07-25 07:40:29.556501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.556859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.557221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.557583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.557941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.557956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.557971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.557985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.560953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.561331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.561691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.562047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.562453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.562820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.563191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.563555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.563911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.564301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.195 [2024-07-25 07:40:29.564317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.564331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.564344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.566802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.567170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.567531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.196 [2024-07-25 07:40:29.569012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.569303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.570850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.572392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.573093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.574405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.574657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.574672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.574685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.574698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.576826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.577193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.578341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.579622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.579869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.581416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.582435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.584072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.585740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.585996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.586011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.586024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.586037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.588285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.196 [2024-07-25 07:40:29.589117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.590401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.591921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.592174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.593543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.594830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.596098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.597629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.597880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.597895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.597908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.597921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.600577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.601866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.603403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.604925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.605179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.606187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.607468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.608992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.610527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.610903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.610927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.610940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.196 [2024-07-25 07:40:29.610953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.615016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.616514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.618118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.619768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.620089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.621377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.622907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.624392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.625582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.625993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.626008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.626022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.626035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.629344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.630875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.632404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.633096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.633353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.634937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.636611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.638136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.638502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.638916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.196 [2024-07-25 07:40:29.638932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.638949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.638963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.642394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.643936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.644886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.646559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.646812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.648351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.649885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.650342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.650703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.651086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.651101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.651114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.196 [2024-07-25 07:40:29.651128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.654354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.655735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.657005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.658274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.658524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.660066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.660985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.661370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.197 [2024-07-25 07:40:29.661732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.662171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.662187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.662201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.662219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.665268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.665973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.667257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.668785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.669035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.670589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.670951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.671309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.671665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.672075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.672092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.672105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.672121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.674448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.676079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.677672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.679365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.679614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.680071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.197 [2024-07-25 07:40:29.680436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.680797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.681161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.681511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.681526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.681539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.681552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.683902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.685186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.686710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.688240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.688616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.688988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.689349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.689706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.690279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.690530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.690546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.690563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.690576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.693362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.694876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.696398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.697442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.197 [2024-07-25 07:40:29.697809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.698182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.698542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.698901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.700399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.700708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.700723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.700736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.700749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.703609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.705165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.706843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.707210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.707624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.707989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.708351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.709419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.710694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.710944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.197 [2024-07-25 07:40:29.710959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.710973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.710986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.713969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.715503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.198 [2024-07-25 07:40:29.716037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.716404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.716812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.717184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.717630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.718972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.720477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.720726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.720741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.720754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.198 [2024-07-25 07:40:29.720767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.723778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.724984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.725348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.725705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.726142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.726507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.728070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.729504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.731041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.731298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.731314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.731326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.731339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.460 [2024-07-25 07:40:29.734518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.734882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.735244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.735605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.736027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.736975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.738260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.739794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.741319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.741641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.741656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.741670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.741683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.743770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.744136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.744495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.744853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.745219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.746504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.748031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.749558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.750774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.751050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.751065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.460 [2024-07-25 07:40:29.751077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.751090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.752918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.753286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.753348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.753711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.754113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.755717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.757180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.758749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.760416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.760749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.760764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.760777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.460 [2024-07-25 07:40:29.760794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.762545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.762906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.763268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.763311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.763716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.765117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.766401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.767910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.769432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.461 [2024-07-25 07:40:29.769784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.769798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.769812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.769825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.771885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.772294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.772310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.772324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.772338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.774226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.774266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.774303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.774378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.774623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.774666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.461 [2024-07-25 07:40:29.774712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.461 [2024-07-25 07:40:29.774753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:57.461 ... (same error repeated for each allocation attempt from 07:40:29.774753 through 07:40:29.912467) ...
00:34:57.467 [2024-07-25 07:40:29.912467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:57.467 [2024-07-25 07:40:29.912807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.913183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.913543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.913901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.914269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.914665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.914681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.914694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.914707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.917292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.917658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.918020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.918388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.918759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.919129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.919494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.919856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.920222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.920631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.920646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.920660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.920674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.923198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.923564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.467 [2024-07-25 07:40:29.923939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.924306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.924559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.925897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.927421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.928965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.929811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.930148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.930163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.930176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.930189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.932082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.932456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.932818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.934169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.934457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.936001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.937522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.938338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.939845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.940099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.940114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.940127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.940146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.467 [2024-07-25 07:40:29.942333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.942699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.943741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.945009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.945268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.946819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.947924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.949485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.950906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.951162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.951178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.951191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.951204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.953473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.953997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.955278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.956802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.957056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.958730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.959761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.961036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.962562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.962813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.962828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.467 [2024-07-25 07:40:29.962841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.962858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.965395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.966956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.968624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.970153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.970416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.971251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.972540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.974055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.975541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.975841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.975857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.975870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.467 [2024-07-25 07:40:29.975883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.979538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.980811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.982346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.983877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.984254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.985692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.987232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.468 [2024-07-25 07:40:29.988766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.729 [2024-07-25 07:40:29.990063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.729 [2024-07-25 07:40:29.990443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.729 [2024-07-25 07:40:29.990459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.729 [2024-07-25 07:40:29.990473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.729 [2024-07-25 07:40:29.990486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.729 [2024-07-25 07:40:29.993830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:29.995366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:29.996898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:29.997594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:29.997852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:29.999337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.000939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.002524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.002888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.003313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.003330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.003344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.003358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.006748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.008269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.009349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.010899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.011206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.012725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.014245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.730 [2024-07-25 07:40:30.014796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.015163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.015573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.015589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.015603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.015616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.019056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.020514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.021707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.022975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.023240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.024741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.025702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.026084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.026457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.026884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.026900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.026916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.026930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.030025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.030731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.032007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.033539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.033791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.730 [2024-07-25 07:40:30.035407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.035770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.036128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.036496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.036905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.036922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.036937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.036950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.039431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.041121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.042710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.044405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.044662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.045071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.045437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.045794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.046163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.046512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.046531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.046548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.046563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.049066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.050351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.051867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.730 [2024-07-25 07:40:30.053387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.053733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.054106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.054472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.054835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.055242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.055491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.055507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.055520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.055533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.058345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.059874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.061406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.062358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.062735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.063104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.063471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.063830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.730 [2024-07-25 07:40:30.065374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.065681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.065696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.065709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.065722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.068660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.731 [2024-07-25 07:40:30.070291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.071898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.072270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.072684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.073050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.073415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.074371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.075655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.075906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.075920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.075934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.075947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.079005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.080537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.080926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.081291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.081676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.082043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.082536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.083827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.085346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.085594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.085609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.085622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.731 [2024-07-25 07:40:30.085636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.088658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.089801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.090169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.090528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.090910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.091288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.092715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.094012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.095548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.095803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.095819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.095832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.095845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.098908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.099368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.099732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.100097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.100526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.102147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.103633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.105228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.106754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.107188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.731 [2024-07-25 07:40:30.107203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.107216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.107229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.108989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.109367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.109728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.110087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.110378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.111663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.113183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.114706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.115447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.115701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.115716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.115729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.115742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.117646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.118012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.118379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.119394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.119682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.121218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.122728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.123903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.731 [2024-07-25 07:40:30.125327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.125611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.125626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.125639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.125652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.127655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.128032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.128399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.129896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.130158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.131697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.133231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.134050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.731 [2024-07-25 07:40:30.135336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.135587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.135603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.135616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.135629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.137766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.138134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.138186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.139860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.140113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.141661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.732 [2024-07-25 07:40:30.143195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.143891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.145176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.145427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.145441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.145455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.145468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.147530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.147895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.149001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.149047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.149361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.150908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.152443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.153287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.154812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.155066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.155081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.155094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.155107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.156996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.157039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.157079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:57.732 [2024-07-25 07:40:30.157119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:57.732 [2024-07-25 07:40:30.157471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(the error above is emitted several hundred more times, with only the microsecond timestamp advancing from 07:40:30.157471 to 07:40:30.254352, while the big-IO verify jobs keep the dpdk_cryptodev mbuf pool exhausted; the duplicate lines are condensed here)
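Editor's note: the flood condensed above is the dpdk_cryptodev accel module reporting that it could not allocate source mbufs for a crypto task; with 128 IOs in flight per job the mbuf mempool is briefly exhausted and the task is presumably queued and retried, which is consistent with the Fail/s column reading 0.00 in the results below. If the exact repeat count matters, it can be recovered from a saved copy of this console output (the log filename below is only a placeholder):

# count the repeats and show the first/last occurrence in a saved console log
grep -c 'Failed to get src_mbufs' crypto-phy-autotest-console.log
grep 'Failed to get src_mbufs' crypto-phy-autotest-console.log | head -n 1
grep 'Failed to get src_mbufs' crypto-phy-autotest-console.log | tail -n 1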
00:34:58.304 
00:34:58.304 Latency(us)
00:34:58.304 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:58.304 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x0 length 0x100
00:34:58.304 crypto_ram : 5.94 43.10 2.69 0.00 0.00 2885461.61 261724.57 2456184.42
00:34:58.304 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x100 length 0x100
00:34:58.304 crypto_ram : 5.88 43.57 2.72 0.00 0.00 2842456.88 288568.12 2308544.92
00:34:58.304 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x0 length 0x100
00:34:58.304 crypto_ram1 : 5.94 43.09 2.69 0.00 0.00 2789824.92 260046.85 2268279.60
00:34:58.304 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x100 length 0x100
00:34:58.304 crypto_ram1 : 5.88 43.56 2.72 0.00 0.00 2748917.35 288568.12 2120640.10
00:34:58.304 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x0 length 0x100
00:34:58.304 crypto_ram2 : 5.57 276.08 17.26 0.00 0.00 414286.91 2634.55 624112.44
00:34:58.304 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x100 length 0x100
00:34:58.304 crypto_ram2 : 5.59 297.36 18.58 0.00 0.00 387307.91 78014.05 614046.11
00:34:58.304 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x0 length 0x100
00:34:58.304 crypto_ram3 : 5.71 291.31 18.21 0.00 0.00 383686.44 59559.12 369098.75
00:34:58.304 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:58.304 Verification LBA range: start 0x100 length 0x100
00:34:58.304 crypto_ram3 : 5.71 310.32 19.39 0.00 0.00 360341.63 49702.50 456340.28
00:34:58.304 ===================================================================================================================
00:34:58.304 Total : 1348.39 84.27 0.00 0.00 710502.21 2634.55 2456184.42
00:34:58.902 
00:34:58.902 real 0m8.961s
00:34:58.902 user 0m17.069s
00:34:58.902 sys 0m0.416s
00:34:58.902 07:40:31 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable
00:34:58.902 07:40:31 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:34:58.902 ************************************
00:34:58.902 END TEST bdev_verify_big_io
00:34:58.902 ************************************
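Editor's note: the IOPS and MiB/s columns in the verify results above are mutually consistent with the 65536-byte IO size declared in each Job header (MiB/s = IOPS x 64 KiB / 1 MiB = IOPS / 16). A quick spot-check with values copied from the table:

# MiB/s = IOPS * IO_size / 2^20; compare against the table's MiB/s column
awk 'BEGIN { printf "crypto_ram  (0x1): %.2f MiB/s\n", 43.10  * 65536 / 1048576 }'   # table: 2.69
awk 'BEGIN { printf "crypto_ram  (0x2): %.2f MiB/s\n", 43.57  * 65536 / 1048576 }'   # table: 2.72
awk 'BEGIN { printf "crypto_ram3 (0x1): %.2f MiB/s\n", 291.31 * 65536 / 1048576 }'   # table: 18.21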
00:34:58.902 07:40:31 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:58.902 07:40:31 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:34:58.902 07:40:31 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable
00:34:58.902 07:40:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:58.902 ************************************
00:34:58.902 START TEST bdev_write_zeroes
00:34:58.902 ************************************
00:34:58.902 07:40:31 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
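Editor's note on the invocation above: --json loads the bdev/crypto configuration, -q 128 keeps 128 IOs outstanding per job, -o 4096 issues 4 KiB IOs, -w write_zeroes selects the WRITE ZEROES workload, and -t 1 runs it for one second. The same binary can be re-run by hand against any bdev config, for example:

# from the SPDK build tree; paths shortened for readability, adjust to the
# checkout location on the build host
./build/examples/bdevperf --json ./test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1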
be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:58.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:58.902 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:58.902 [2024-07-25 07:40:31.394168] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:59.161 [2024-07-25 07:40:31.476870] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:34:59.161 [2024-07-25 07:40:31.498204] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:59.161 [2024-07-25 07:40:31.506226] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:59.161 [2024-07-25 07:40:31.514245] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:59.161 [2024-07-25 07:40:31.620782] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:01.685 [2024-07-25 07:40:33.783707] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:01.685 [2024-07-25 07:40:33.783770] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:01.685 [2024-07-25 07:40:33.783784] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation 
deferred pending base bdev arrival 00:35:01.685 [2024-07-25 07:40:33.791726] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:01.685 [2024-07-25 07:40:33.791744] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:01.685 [2024-07-25 07:40:33.791754] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:01.685 [2024-07-25 07:40:33.799746] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:01.685 [2024-07-25 07:40:33.799762] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:01.685 [2024-07-25 07:40:33.799772] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:01.685 [2024-07-25 07:40:33.807767] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:01.685 [2024-07-25 07:40:33.807783] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:01.685 [2024-07-25 07:40:33.807793] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:01.685 Running I/O for 1 seconds... 00:35:02.617 00:35:02.617 Latency(us) 00:35:02.617 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:02.617 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:02.617 crypto_ram : 1.02 2182.11 8.52 0.00 0.00 58274.31 5111.81 70044.88 00:35:02.617 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:02.617 crypto_ram1 : 1.02 2195.28 8.58 0.00 0.00 57683.95 5111.81 65011.71 00:35:02.617 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:02.617 crypto_ram2 : 1.02 16903.95 66.03 0.00 0.00 7480.11 2241.33 9856.61 00:35:02.617 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:02.617 crypto_ram3 : 1.02 16882.52 65.95 0.00 0.00 7458.43 2254.44 7811.89 00:35:02.617 =================================================================================================================== 00:35:02.617 Total : 38163.87 149.08 0.00 0.00 13284.43 2241.33 70044.88 00:35:02.875 00:35:02.875 real 0m4.036s 00:35:02.875 user 0m3.669s 00:35:02.875 sys 0m0.327s 00:35:02.875 07:40:35 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:02.875 07:40:35 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:35:02.875 ************************************ 00:35:02.875 END TEST bdev_write_zeroes 00:35:02.875 ************************************ 00:35:02.875 07:40:35 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:02.875 07:40:35 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:35:02.875 07:40:35 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:02.875 07:40:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:02.875 ************************************ 00:35:02.875 START TEST bdev_json_nonenclosed 00:35:02.875 ************************************ 00:35:02.875 07:40:35 blockdev_crypto_qat.bdev_json_nonenclosed -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:02.875 [2024-07-25 07:40:35.366489] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:35:02.875 [2024-07-25 07:40:35.366544] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1838350 ] 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:35:03.133 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:03.133 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.133 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:03.133 [2024-07-25 07:40:35.497960] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:03.133 [2024-07-25 07:40:35.582744] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:03.133 [2024-07-25 07:40:35.582803] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:35:03.133 [2024-07-25 07:40:35.582819] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:03.133 [2024-07-25 07:40:35.582830] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:03.390 00:35:03.390 real 0m0.348s 00:35:03.390 user 0m0.203s 00:35:03.390 sys 0m0.143s 00:35:03.390 07:40:35 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:03.390 07:40:35 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:35:03.390 ************************************ 00:35:03.390 END TEST bdev_json_nonenclosed 00:35:03.390 ************************************ 00:35:03.390 07:40:35 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:03.391 07:40:35 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:35:03.391 07:40:35 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:03.391 07:40:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:03.391 ************************************ 00:35:03.391 START TEST bdev_json_nonarray 00:35:03.391 ************************************ 00:35:03.391 07:40:35 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:03.391 [2024-07-25 07:40:35.797219] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
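The bdev_json_nonenclosed run above and the bdev_json_nonarray run starting here are negative tests of bdevperf's --json loader; the contents of nonenclosed.json and nonarray.json are not shown in this log, but the two json_config.c errors ("not enclosed in {}" and "'subsystems' should be an array") imply the accepted shape: a top-level JSON object whose "subsystems" member is an array, the same shape the chaining test generates for spdk_dd later in this log. A minimal, hypothetical well-formed counterpart (file name and empty config are illustrative only) might look like:

cat > /tmp/wellformed.json <<'EOF'
{
  "subsystems": [
    { "subsystem": "bdev", "config": [] }
  ]
}
EOF
# nonenclosed.json presumably drops the outer {...} and nonarray.json presumably makes
# "subsystems" a non-array value, so json_config_prepare_ctx rejects both and
# spdk_app_stop exits non-zero, as the surrounding trace shows.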
00:35:03.391 [2024-07-25 07:40:35.797272] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1838385 ] 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:03.391 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:03.391 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:03.391 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:03.650 [2024-07-25 07:40:35.929889] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:03.650 [2024-07-25 07:40:36.014579] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:03.650 [2024-07-25 07:40:36.014647] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:35:03.650 [2024-07-25 07:40:36.014663] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:03.650 [2024-07-25 07:40:36.014674] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:03.650 00:35:03.650 real 0m0.364s 00:35:03.650 user 0m0.213s 00:35:03.650 sys 0m0.148s 00:35:03.650 07:40:36 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:03.650 07:40:36 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:35:03.650 ************************************ 00:35:03.650 END TEST bdev_json_nonarray 00:35:03.650 ************************************ 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]] 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]] 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]] 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:35:03.650 07:40:36 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:35:03.650 00:35:03.650 real 1m10.162s 00:35:03.650 user 2m54.960s 00:35:03.650 sys 0m8.402s 00:35:03.650 
07:40:36 blockdev_crypto_qat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:03.650 07:40:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:03.650 ************************************ 00:35:03.650 END TEST blockdev_crypto_qat 00:35:03.650 ************************************ 00:35:03.909 07:40:36 -- spdk/autotest.sh@364 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:35:03.909 07:40:36 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:35:03.909 07:40:36 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:03.909 07:40:36 -- common/autotest_common.sh@10 -- # set +x 00:35:03.909 ************************************ 00:35:03.909 START TEST chaining 00:35:03.909 ************************************ 00:35:03.909 07:40:36 chaining -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:35:03.909 * Looking for test storage... 00:35:03.909 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:03.909 07:40:36 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@7 -- # uname -s 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:03.909 07:40:36 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:35:03.909 07:40:36 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:03.909 07:40:36 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:03.909 07:40:36 chaining -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.909 07:40:36 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.909 07:40:36 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.909 07:40:36 chaining -- paths/export.sh@5 -- # export PATH 00:35:03.909 07:40:36 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@47 -- # : 0 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:35:03.909 07:40:36 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:35:03.909 07:40:36 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:35:03.909 07:40:36 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:35:03.909 07:40:36 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:35:03.909 07:40:36 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:35:03.909 07:40:36 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:03.909 07:40:36 chaining -- 
nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:03.909 07:40:36 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:03.909 07:40:36 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:03.909 07:40:36 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:03.909 07:40:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.0 (0x8086 - 0x159b)' 00:35:13.875 Found 0000:20:00.0 (0x8086 - 0x159b) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@342 
-- # [[ ice == unknown ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.1 (0x8086 - 0x159b)' 00:35:13.875 Found 0000:20:00.1 (0x8086 - 0x159b) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.0: cvl_0_0' 00:35:13.875 Found net devices under 0000:20:00.0: cvl_0_0 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.1: cvl_0_1' 00:35:13.875 Found net devices under 0000:20:00.1: cvl_0_1 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:35:13.875 07:40:44 chaining -- 
nvmf/common.sh@234 -- # (( 2 > 1 )) 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:35:13.875 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:13.875 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.144 ms 00:35:13.875 00:35:13.875 --- 10.0.0.2 ping statistics --- 00:35:13.875 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:13.875 rtt min/avg/max/mdev = 0.144/0.144/0.144/0.000 ms 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:35:13.875 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:13.875 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.130 ms 00:35:13.875 00:35:13.875 --- 10.0.0.1 ping statistics --- 00:35:13.875 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:13.875 rtt min/avg/max/mdev = 0.130/0.130/0.130/0.000 ms 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@422 -- # return 0 00:35:13.875 07:40:44 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:13.876 07:40:44 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@481 -- # nvmfpid=1842599 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@482 -- # waitforlisten 1842599 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@831 -- # '[' -z 1842599 ']' 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:13.876 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:13.876 07:40:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.876 07:40:44 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:13.876 [2024-07-25 07:40:45.036125] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
00:35:13.876 [2024-07-25 07:40:45.036190] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:13.876 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:13.876 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.876 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:13.876 [2024-07-25 07:40:45.163135] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:13.876 [2024-07-25 07:40:45.249944] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:13.876 [2024-07-25 07:40:45.249989] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:13.876 [2024-07-25 07:40:45.250002] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:13.876 [2024-07-25 07:40:45.250014] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:13.876 [2024-07-25 07:40:45.250024] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
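Before any I/O is issued, chaining.sh configures the freshly started nvmf_tgt over its RPC socket; the log only records the resulting notices (malloc0, crypto0/crypto1 found with key0/key1, the TCP transport init, and the listener on 10.0.0.2:4420). A rough, hypothetical sketch of equivalent standalone rpc.py calls follows; the command spellings, key-creation flags, cipher choice, bdev stacking, and namespace layout are assumptions for illustration, not the harness's literal invocation:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc bdev_malloc_create -b malloc0 64 512                                # base bdev (size/block size illustrative)
$rpc accel_crypto_key_create -c AES_CBC -n key0 -k 00112233445566778899001122334455   # key0[0] from chaining.sh@17
$rpc accel_crypto_key_create -c AES_CBC -n key1 -k 22334455667788990011223344550011   # key1[0] from chaining.sh@18
$rpc bdev_crypto_create -n key0 malloc0 crypto0                          # matches the 'Found key "key0"' notice
$rpc bdev_crypto_create -n key1 crypto0 crypto1                          # stacked on crypto0 (assumed from the test name)
$rpc nvmf_create_transport -t TCP                                        # "*** TCP Transport Init ***"
$rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDKISFASTANDAWESOME
$rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 crypto1            # namespace choice assumed
$rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420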
00:35:13.876 [2024-07-25 07:40:45.250050] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@864 -- # return 0 00:35:13.876 07:40:45 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.876 07:40:45 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.BpdMK7S2VV 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.cUMzkSobwK 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.876 malloc0 00:35:13.876 true 00:35:13.876 true 00:35:13.876 [2024-07-25 07:40:45.949578] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:13.876 crypto0 00:35:13.876 [2024-07-25 07:40:45.957606] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:13.876 crypto1 00:35:13.876 [2024-07-25 07:40:45.965716] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:13.876 [2024-07-25 07:40:45.981917] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@85 -- # update_stats 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:13.876 07:40:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:13.876 07:40:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.876 07:40:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:13.876 07:40:46 chaining -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:35:13.876 07:40:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:13.876 07:40:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:35:13.876 07:40:46 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:13.877 07:40:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:13.877 07:40:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.877 07:40:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:13.877 07:40:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:13.877 07:40:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:13.877 07:40:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.BpdMK7S2VV bs=1K count=64 00:35:13.877 64+0 records in 00:35:13.877 64+0 records out 00:35:13.877 65536 bytes (66 kB, 64 KiB) copied, 0.00106524 s, 61.5 MB/s 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.BpdMK7S2VV --ob Nvme0n1 --bs 65536 --count 1 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@25 -- # local config 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:13.877 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:13.877 "subsystems": [ 00:35:13.877 { 00:35:13.877 "subsystem": "bdev", 00:35:13.877 "config": [ 00:35:13.877 { 00:35:13.877 "method": "bdev_nvme_attach_controller", 00:35:13.877 "params": { 00:35:13.877 "trtype": "tcp", 
00:35:13.877 "adrfam": "IPv4", 00:35:13.877 "name": "Nvme0", 00:35:13.877 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:13.877 "traddr": "10.0.0.2", 00:35:13.877 "trsvcid": "4420" 00:35:13.877 } 00:35:13.877 }, 00:35:13.877 { 00:35:13.877 "method": "bdev_set_options", 00:35:13.877 "params": { 00:35:13.877 "bdev_auto_examine": false 00:35:13.877 } 00:35:13.877 } 00:35:13.877 ] 00:35:13.877 } 00:35:13.877 ] 00:35:13.877 }' 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.BpdMK7S2VV --ob Nvme0n1 --bs 65536 --count 1 00:35:13.877 07:40:46 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:13.877 "subsystems": [ 00:35:13.877 { 00:35:13.877 "subsystem": "bdev", 00:35:13.877 "config": [ 00:35:13.877 { 00:35:13.877 "method": "bdev_nvme_attach_controller", 00:35:13.877 "params": { 00:35:13.877 "trtype": "tcp", 00:35:13.877 "adrfam": "IPv4", 00:35:13.877 "name": "Nvme0", 00:35:13.877 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:13.877 "traddr": "10.0.0.2", 00:35:13.877 "trsvcid": "4420" 00:35:13.877 } 00:35:13.877 }, 00:35:13.877 { 00:35:13.877 "method": "bdev_set_options", 00:35:13.877 "params": { 00:35:13.877 "bdev_auto_examine": false 00:35:13.877 } 00:35:13.877 } 00:35:13.877 ] 00:35:13.877 } 00:35:13.877 ] 00:35:13.877 }' 00:35:13.877 [2024-07-25 07:40:46.264128] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:35:13.877 [2024-07-25 07:40:46.264191] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1842746 ] 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: 
Requested device 0000:3d:02.4 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:13.877 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:13.877 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:13.877 [2024-07-25 07:40:46.395677] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:14.136 [2024-07-25 07:40:46.479300] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:14.652  Copying: 64/64 [kB] (average 31 MBps) 00:35:14.652 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:14.652 
07:40:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:14.652 07:40:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.652 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@96 -- # update_stats 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@51 -- # get_stat 
sequence_executed 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:14.910 07:40:47 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.910 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:14.910 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:35:14.910 07:40:47 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:14.911 07:40:47 
chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:14.911 07:40:47 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.cUMzkSobwK --ib Nvme0n1 --bs 65536 --count 1 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@25 -- # local config 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:14.911 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:14.911 "subsystems": [ 00:35:14.911 { 00:35:14.911 "subsystem": "bdev", 00:35:14.911 "config": [ 00:35:14.911 { 00:35:14.911 "method": "bdev_nvme_attach_controller", 00:35:14.911 "params": { 00:35:14.911 "trtype": "tcp", 00:35:14.911 "adrfam": "IPv4", 00:35:14.911 "name": "Nvme0", 00:35:14.911 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:14.911 "traddr": "10.0.0.2", 00:35:14.911 "trsvcid": "4420" 00:35:14.911 } 00:35:14.911 }, 00:35:14.911 { 00:35:14.911 "method": "bdev_set_options", 00:35:14.911 "params": { 00:35:14.911 "bdev_auto_examine": false 00:35:14.911 } 00:35:14.911 } 00:35:14.911 ] 00:35:14.911 } 00:35:14.911 ] 00:35:14.911 }' 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.cUMzkSobwK --ib Nvme0n1 --bs 65536 --count 1 00:35:14.911 07:40:47 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:14.911 "subsystems": [ 00:35:14.911 { 00:35:14.911 "subsystem": "bdev", 00:35:14.911 "config": [ 00:35:14.911 { 00:35:14.911 "method": "bdev_nvme_attach_controller", 00:35:14.911 "params": { 00:35:14.911 "trtype": "tcp", 00:35:14.911 "adrfam": "IPv4", 00:35:14.911 "name": "Nvme0", 00:35:14.911 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:14.911 "traddr": "10.0.0.2", 00:35:14.911 "trsvcid": "4420" 00:35:14.911 } 00:35:14.911 }, 00:35:14.911 { 00:35:14.911 "method": "bdev_set_options", 00:35:14.911 "params": { 00:35:14.911 "bdev_auto_examine": false 00:35:14.911 } 00:35:14.911 } 00:35:14.911 ] 00:35:14.911 } 00:35:14.911 ] 00:35:14.911 }' 00:35:15.170 [2024-07-25 07:40:47.478038] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
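Editorial note: the spdk_dd invocation traced above boils down to one pattern that repeats for every transfer in this test. A condensed sketch follows; the gen_nvme.sh | jq pipeline and the -c /dev/fd/62 handoff are taken from the xtrace itself, while the shortened relative paths, the /tmp/chaining.out name, and the use of process substitution are illustrative assumptions rather than values from this run.

    # Build the bdev config for the remote target, then append bdev_set_options via jq
    # (same jq filter as in the trace: index the config array at its current length).
    config=$(scripts/gen_nvme.sh --mode=remote --json-with-subsystems \
                 --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 |
             jq '.subsystems[0].config[.subsystems[0].config | length] |=
                 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')
    # Read one 64 KiB block from the crypto bdev Nvme0n1 into a scratch file; the config
    # reaches spdk_dd as /dev/fd/62 via process substitution (assumed).
    build/bin/spdk_dd -c <(echo "$config") --of /tmp/chaining.out --ib Nvme0n1 --bs 65536 --count 1
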
00:35:15.170 [2024-07-25 07:40:47.478099] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1843033 ] 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:15.170 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:15.170 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.170 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:15.170 [2024-07-25 07:40:47.610742] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:15.170 [2024-07-25 07:40:47.691404] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:15.736  Copying: 64/64 [kB] (average 62 MBps) 00:35:15.736 00:35:15.736 07:40:48 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:35:15.736 07:40:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:15.736 07:40:48 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:15.736 07:40:48 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:15.736 07:40:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:15.736 07:40:48 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:15.737 07:40:48 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:15.737 07:40:48 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:15.737 07:40:48 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:15.737 07:40:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:15.737 07:40:48 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:15.737 07:40:48 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:35:15.995 07:40:48 chaining -- 
bdev/chaining.sh@102 -- # get_stat executed decrypt 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:15.995 07:40:48 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.BpdMK7S2VV /tmp/tmp.cUMzkSobwK 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@25 -- # local config 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:15.995 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:15.995 "subsystems": [ 00:35:15.995 { 00:35:15.995 "subsystem": "bdev", 00:35:15.995 "config": [ 00:35:15.995 { 00:35:15.995 "method": "bdev_nvme_attach_controller", 00:35:15.995 "params": { 00:35:15.995 "trtype": "tcp", 00:35:15.995 "adrfam": "IPv4", 00:35:15.995 "name": "Nvme0", 00:35:15.995 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:15.995 "traddr": "10.0.0.2", 00:35:15.995 "trsvcid": "4420" 00:35:15.995 } 00:35:15.995 }, 00:35:15.995 { 00:35:15.995 "method": "bdev_set_options", 00:35:15.995 "params": { 00:35:15.995 "bdev_auto_examine": false 00:35:15.995 } 00:35:15.995 } 00:35:15.995 ] 00:35:15.995 } 00:35:15.995 ] 00:35:15.995 }' 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero 
--ob Nvme0n1 --bs 65536 --count 1 00:35:15.995 07:40:48 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:15.995 "subsystems": [ 00:35:15.995 { 00:35:15.995 "subsystem": "bdev", 00:35:15.995 "config": [ 00:35:15.995 { 00:35:15.995 "method": "bdev_nvme_attach_controller", 00:35:15.995 "params": { 00:35:15.995 "trtype": "tcp", 00:35:15.995 "adrfam": "IPv4", 00:35:15.995 "name": "Nvme0", 00:35:15.995 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:15.995 "traddr": "10.0.0.2", 00:35:15.995 "trsvcid": "4420" 00:35:15.995 } 00:35:15.995 }, 00:35:15.995 { 00:35:15.995 "method": "bdev_set_options", 00:35:15.995 "params": { 00:35:15.995 "bdev_auto_examine": false 00:35:15.995 } 00:35:15.995 } 00:35:15.995 ] 00:35:15.995 } 00:35:15.995 ] 00:35:15.995 }' 00:35:15.995 [2024-07-25 07:40:48.508324] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:35:15.995 [2024-07-25 07:40:48.508385] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1843313 ] 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.0 
cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:16.254 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:16.254 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:16.254 [2024-07-25 07:40:48.640912] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:16.254 [2024-07-25 07:40:48.723096] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:16.771  Copying: 64/64 [kB] (average 10 MBps) 00:35:16.771 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@106 -- # update_stats 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:16.771 07:40:49 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:16.771 07:40:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:16.771 07:40:49 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:16.771 07:40:49 chaining -- 
bdev/chaining.sh@37 -- # local event opcode rpc 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:16.771 07:40:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:16.771 07:40:49 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:16.771 07:40:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:16.771 07:40:49 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:17.029 07:40:49 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:17.029 07:40:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:17.029 07:40:49 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:17.029 07:40:49 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:17.029 07:40:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:17.029 07:40:49 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.BpdMK7S2VV --ob Nvme0n1 --bs 4096 --count 16 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@25 -- # local config 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:17.029 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:17.029 
"subsystems": [ 00:35:17.029 { 00:35:17.029 "subsystem": "bdev", 00:35:17.029 "config": [ 00:35:17.029 { 00:35:17.029 "method": "bdev_nvme_attach_controller", 00:35:17.029 "params": { 00:35:17.029 "trtype": "tcp", 00:35:17.029 "adrfam": "IPv4", 00:35:17.029 "name": "Nvme0", 00:35:17.029 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:17.029 "traddr": "10.0.0.2", 00:35:17.029 "trsvcid": "4420" 00:35:17.029 } 00:35:17.029 }, 00:35:17.029 { 00:35:17.029 "method": "bdev_set_options", 00:35:17.029 "params": { 00:35:17.029 "bdev_auto_examine": false 00:35:17.029 } 00:35:17.029 } 00:35:17.029 ] 00:35:17.029 } 00:35:17.029 ] 00:35:17.029 }' 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.BpdMK7S2VV --ob Nvme0n1 --bs 4096 --count 16 00:35:17.029 07:40:49 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:17.029 "subsystems": [ 00:35:17.029 { 00:35:17.029 "subsystem": "bdev", 00:35:17.029 "config": [ 00:35:17.029 { 00:35:17.029 "method": "bdev_nvme_attach_controller", 00:35:17.029 "params": { 00:35:17.029 "trtype": "tcp", 00:35:17.029 "adrfam": "IPv4", 00:35:17.029 "name": "Nvme0", 00:35:17.029 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:17.029 "traddr": "10.0.0.2", 00:35:17.029 "trsvcid": "4420" 00:35:17.029 } 00:35:17.029 }, 00:35:17.029 { 00:35:17.029 "method": "bdev_set_options", 00:35:17.030 "params": { 00:35:17.030 "bdev_auto_examine": false 00:35:17.030 } 00:35:17.030 } 00:35:17.030 ] 00:35:17.030 } 00:35:17.030 ] 00:35:17.030 }' 00:35:17.030 [2024-07-25 07:40:49.510989] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:35:17.030 [2024-07-25 07:40:49.511049] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1843353 ] 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:17.288 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:17.288 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.288 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:17.289 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:17.289 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:17.289 [2024-07-25 07:40:49.642887] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:17.289 [2024-07-25 07:40:49.724490] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:17.805  Copying: 64/64 [kB] (average 20 MBps) 00:35:17.805 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:17.805 
07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:17.805 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:17.805 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:17.805 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:17.805 07:40:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:17.805 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:17.805 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:17.805 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 
00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@114 -- # update_stats 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:18.061 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:18.061 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@40 -- # [[ 
-z copy ]] 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:18.062 07:40:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:18.062 07:40:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@117 -- # : 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.cUMzkSobwK --ib Nvme0n1 --bs 4096 --count 16 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@25 -- # local config 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:18.319 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:18.319 "subsystems": [ 00:35:18.319 { 00:35:18.319 "subsystem": "bdev", 00:35:18.319 "config": [ 00:35:18.319 { 00:35:18.319 "method": "bdev_nvme_attach_controller", 00:35:18.319 "params": { 00:35:18.319 "trtype": "tcp", 00:35:18.319 "adrfam": "IPv4", 00:35:18.319 "name": "Nvme0", 00:35:18.319 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:18.319 "traddr": "10.0.0.2", 00:35:18.319 "trsvcid": "4420" 00:35:18.319 } 00:35:18.319 }, 00:35:18.319 { 00:35:18.319 "method": "bdev_set_options", 00:35:18.319 "params": { 00:35:18.319 "bdev_auto_examine": false 00:35:18.319 } 00:35:18.319 } 00:35:18.319 ] 00:35:18.319 } 00:35:18.319 ] 00:35:18.319 }' 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:18.319 "subsystems": [ 00:35:18.319 { 00:35:18.319 "subsystem": "bdev", 00:35:18.319 "config": [ 00:35:18.319 { 00:35:18.319 "method": "bdev_nvme_attach_controller", 00:35:18.319 "params": { 00:35:18.319 "trtype": "tcp", 00:35:18.319 "adrfam": "IPv4", 00:35:18.319 "name": "Nvme0", 00:35:18.319 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:18.319 "traddr": "10.0.0.2", 00:35:18.319 "trsvcid": "4420" 00:35:18.319 } 00:35:18.319 }, 00:35:18.319 { 00:35:18.319 "method": "bdev_set_options", 00:35:18.319 "params": { 00:35:18.319 "bdev_auto_examine": false 00:35:18.319 } 00:35:18.319 } 00:35:18.319 ] 00:35:18.319 } 00:35:18.319 ] 00:35:18.319 }' 00:35:18.319 07:40:50 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.cUMzkSobwK --ib Nvme0n1 --bs 4096 --count 16 00:35:18.319 [2024-07-25 07:40:50.696507] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
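Editorial note: the repeated get_stat/update_stats blocks above all follow a single pattern: pull the accel framework counters over RPC, extract one field with jq, cache it in the stats associative array, and later assert the expected delta. A simplified reconstruction is shown below; the real helpers in bdev/chaining.sh also toggle xtrace and walk every opcode, and rpc_cmd is the test suite's RPC wrapper, all assumed available here.

    declare -A stats
    get_stat() {    # usage: get_stat sequence_executed  |  get_stat executed <opcode>
        local event=$1 opcode=$2
        if [[ -z $opcode ]]; then
            rpc_cmd accel_get_stats | jq -r ".$event"
        else
            rpc_cmd accel_get_stats | jq -r ".operations[] | select(.opcode == \"$opcode\").executed"
        fi
    }
    stats["sequence_executed"]=$(get_stat sequence_executed)
    # ... push 16 more 4 KiB I/Os through the chained crypto bdev, then assert the delta:
    (( $(get_stat sequence_executed) == stats[sequence_executed] + 16 ))
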
00:35:18.319 [2024-07-25 07:40:50.696557] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1843650 ] 00:35:18.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.319 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:18.319 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:18.320 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:18.320 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:18.320 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:18.320 [2024-07-25 07:40:50.814327] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:18.577 [2024-07-25 07:40:50.896250] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:19.093  Copying: 64/64 [kB] (average 500 kBps) 00:35:19.093 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:35:19.093 07:40:51 chaining -- 
bdev/chaining.sh@121 -- # get_stat executed decrypt 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:19.093 07:40:51 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.BpdMK7S2VV /tmp/tmp.cUMzkSobwK 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.BpdMK7S2VV /tmp/tmp.cUMzkSobwK 00:35:19.093 07:40:51 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:35:19.093 07:40:51 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:19.093 07:40:51 chaining -- nvmf/common.sh@117 -- # sync 00:35:19.093 07:40:51 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:19.093 07:40:51 chaining -- nvmf/common.sh@120 -- # set +e 00:35:19.093 07:40:51 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:19.093 07:40:51 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:19.093 rmmod nvme_tcp 00:35:19.352 rmmod nvme_fabrics 00:35:19.352 rmmod nvme_keyring 00:35:19.352 07:40:51 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:19.352 07:40:51 chaining -- nvmf/common.sh@124 -- # set -e 00:35:19.352 07:40:51 chaining -- nvmf/common.sh@125 -- # return 0 00:35:19.352 07:40:51 chaining -- nvmf/common.sh@489 -- # '[' -n 1842599 ']' 00:35:19.352 07:40:51 chaining -- nvmf/common.sh@490 -- # killprocess 1842599 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@950 -- # '[' -z 1842599 ']' 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@954 -- # kill -0 1842599 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@955 -- # uname 
00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1842599 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1842599' 00:35:19.352 killing process with pid 1842599 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@969 -- # kill 1842599 00:35:19.352 07:40:51 chaining -- common/autotest_common.sh@974 -- # wait 1842599 00:35:19.610 07:40:51 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:19.610 07:40:51 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:19.610 07:40:51 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:19.610 07:40:51 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:19.610 07:40:51 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:19.610 07:40:51 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:19.610 07:40:51 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:19.610 07:40:51 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:21.635 07:40:54 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:35:21.635 07:40:54 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:21.635 07:40:54 chaining -- bdev/chaining.sh@132 -- # bperfpid=1844233 00:35:21.635 07:40:54 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1844233 00:35:21.635 07:40:54 chaining -- common/autotest_common.sh@831 -- # '[' -z 1844233 ']' 00:35:21.635 07:40:54 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:21.635 07:40:54 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:21.635 07:40:54 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:21.635 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:21.636 07:40:54 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:21.636 07:40:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:21.636 07:40:54 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:21.636 [2024-07-25 07:40:54.076146] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
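killprocess, traced in the teardown above, boils down to: check the pid is still alive, read back its command name with ps as a sanity check, send it a signal, then wait so its exit status is observed. A rough standalone equivalent (the real helper in autotest_common.sh has more branches, e.g. special handling when the process name turns out to be sudo):

  #!/usr/bin/env bash
  # Rough equivalent of: killprocess <pid>
  killprocess_sketch() {
      local pid=$1 name
      kill -0 "$pid" 2> /dev/null || return 0        # already gone
      name=$(ps --no-headers -o comm= "$pid")        # e.g. reactor_1 in the trace above
      [[ $name == sudo ]] && return 1                # never kill a sudo wrapper directly
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid"    # works because the target was started by this same shell
  }
  # e.g.: killprocess_sketch 1842599   (the nvmf target pid above)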
00:35:21.636 [2024-07-25 07:40:54.076206] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1844233 ] 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:21.636 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:21.636 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:21.636 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:21.894 [2024-07-25 07:40:54.207625] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:21.894 [2024-07-25 07:40:54.293592] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:22.829 07:40:55 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:22.829 07:40:55 chaining -- common/autotest_common.sh@864 -- # return 0 00:35:22.829 07:40:55 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:35:22.829 07:40:55 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:22.829 07:40:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.088 malloc0 00:35:23.088 true 00:35:23.088 true 00:35:23.088 [2024-07-25 07:40:55.392308] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:23.088 crypto0 00:35:23.088 [2024-07-25 07:40:55.400331] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:23.088 crypto1 00:35:23.088 07:40:55 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:23.088 07:40:55 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:23.088 Running I/O for 5 seconds... 
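The rpc_cmd block just above (malloc0 created, then crypto0 resolving key "key0" and crypto1 resolving key "key1") builds the encrypted stack that bdevperf is about to exercise. The literal commands are inside chaining.sh; an illustrative approximation with current rpc.py method names, using throwaway example key material, would be:

  #!/usr/bin/env bash
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  key=00112233445566778899aabbccddeeff       # example key material only
  key2=ffeeddccbbaa99887766554433221100      # example key material only

  # Base bdev to be encrypted: 64 MiB, 512-byte blocks.
  "$rpc" bdev_malloc_create -b malloc0 64 512

  # Register two AES_XTS keys with the accel framework; vbdev_crypto_rpc prints
  # the 'Found key "key0"' / 'Found key "key1"' notices above when they resolve.
  "$rpc" accel_crypto_key_create -c AES_XTS -n key0 -k "$key" -e "$key2"
  "$rpc" accel_crypto_key_create -c AES_XTS -n key1 -k "$key" -e "$key2"

  # Stack two crypto vbdevs so each I/O passes through both keys (the chain).
  "$rpc" bdev_crypto_create -n key0 malloc0 crypto0
  "$rpc" bdev_crypto_create -n key1 crypto0 crypto1

Exact option spellings vary a little between SPDK releases, so treat the flags here as indicative rather than a copy of the script.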
00:35:28.357 00:35:28.357 Latency(us) 00:35:28.357 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:28.357 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:28.357 Verification LBA range: start 0x0 length 0x2000 00:35:28.357 crypto1 : 5.01 12424.76 48.53 0.00 0.00 20548.96 171.21 13159.63 00:35:28.357 =================================================================================================================== 00:35:28.357 Total : 12424.76 48.53 0.00 0.00 20548.96 171.21 13159.63 00:35:28.357 0 00:35:28.357 07:41:00 chaining -- bdev/chaining.sh@146 -- # killprocess 1844233 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@950 -- # '[' -z 1844233 ']' 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@954 -- # kill -0 1844233 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@955 -- # uname 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1844233 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1844233' 00:35:28.357 killing process with pid 1844233 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@969 -- # kill 1844233 00:35:28.357 Received shutdown signal, test time was about 5.000000 seconds 00:35:28.357 00:35:28.357 Latency(us) 00:35:28.357 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:28.357 =================================================================================================================== 00:35:28.357 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@974 -- # wait 1844233 00:35:28.357 07:41:00 chaining -- bdev/chaining.sh@152 -- # bperfpid=1845287 00:35:28.357 07:41:00 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1845287 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@831 -- # '[' -z 1845287 ']' 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:28.357 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:28.357 07:41:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.357 07:41:00 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:28.357 [2024-07-25 07:41:00.856627] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
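Each bdevperf instance here is launched with -z and --wait-for-rpc, so after spawning it the script parks in waitforlisten until the new process answers on its RPC socket. A simplified stand-in for that wait loop (the real waitforlisten in autotest_common.sh adds longer timeouts and nicer diagnostics):

  #!/usr/bin/env bash
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  pid=$1
  sock=${2:-/var/tmp/spdk.sock}

  for (( i = 0; i < 100; i++ )); do
      kill -0 "$pid" 2> /dev/null || { echo "pid $pid exited before listening" >&2; exit 1; }
      if "$rpc" -t 1 -s "$sock" rpc_get_methods &> /dev/null; then
          echo "pid $pid is now listening on $sock"
          exit 0
      fi
      sleep 0.5
  done
  echo "timed out waiting for $sock" >&2
  exit 1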
00:35:28.357 [2024-07-25 07:41:00.856692] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1845287 ] 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:28.616 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:28.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.616 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:28.617 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.617 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:28.617 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:28.617 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:28.617 [2024-07-25 07:41:00.989825] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:28.617 [2024-07-25 07:41:01.073225] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:29.183 07:41:01 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:29.183 07:41:01 chaining -- common/autotest_common.sh@864 -- # return 0 00:35:29.183 07:41:01 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:35:29.183 07:41:01 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:29.183 07:41:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.441 malloc0 00:35:29.441 true 00:35:29.441 true 00:35:29.441 [2024-07-25 07:41:01.832291] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:35:29.441 [2024-07-25 07:41:01.832337] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:29.441 [2024-07-25 07:41:01.832357] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x157d490 00:35:29.441 [2024-07-25 07:41:01.832369] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:29.441 [2024-07-25 07:41:01.833361] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:29.441 [2024-07-25 07:41:01.833385] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:35:29.441 pt0 00:35:29.441 [2024-07-25 07:41:01.840320] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:29.441 crypto0 00:35:29.441 [2024-07-25 07:41:01.848341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:29.441 crypto1 00:35:29.441 07:41:01 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:29.441 07:41:01 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:29.441 Running I/O for 5 seconds... 
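The difference from the previous pass is visible in the vbdev_passthru notices above: a passthru vbdev (pt0) now claims malloc0 and the crypto pair is stacked on top of it, so the chain exercised below is malloc0 -> pt0 -> crypto0 -> crypto1. Again as an illustrative approximation of the RPCs involved (keys assumed registered as in the earlier sketch, exact flags may differ from the script):

  #!/usr/bin/env bash
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

  "$rpc" bdev_malloc_create -b malloc0 64 512
  # Passthru bdev claims malloc0 and re-exports it unchanged as pt0
  # ("Match on malloc0" / "created pt_bdev for: pt0" in the notices above).
  "$rpc" bdev_passthru_create -b malloc0 -p pt0
  # Crypto pair now sits on the passthru bdev instead of directly on malloc0.
  "$rpc" bdev_crypto_create -n key0 pt0 crypto0
  "$rpc" bdev_crypto_create -n key1 crypto0 crypto1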
00:35:34.713 00:35:34.713 Latency(us) 00:35:34.713 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:34.713 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:34.713 Verification LBA range: start 0x0 length 0x2000 00:35:34.713 crypto1 : 5.02 9725.62 37.99 0.00 0.00 26240.21 5452.60 15938.36 00:35:34.713 =================================================================================================================== 00:35:34.713 Total : 9725.62 37.99 0.00 0.00 26240.21 5452.60 15938.36 00:35:34.713 0 00:35:34.713 07:41:07 chaining -- bdev/chaining.sh@167 -- # killprocess 1845287 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@950 -- # '[' -z 1845287 ']' 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@954 -- # kill -0 1845287 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@955 -- # uname 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1845287 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1845287' 00:35:34.713 killing process with pid 1845287 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@969 -- # kill 1845287 00:35:34.713 Received shutdown signal, test time was about 5.000000 seconds 00:35:34.713 00:35:34.713 Latency(us) 00:35:34.713 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:34.713 =================================================================================================================== 00:35:34.713 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:34.713 07:41:07 chaining -- common/autotest_common.sh@974 -- # wait 1845287 00:35:34.971 07:41:07 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:35:34.971 07:41:07 chaining -- bdev/chaining.sh@170 -- # killprocess 1845287 00:35:34.971 07:41:07 chaining -- common/autotest_common.sh@950 -- # '[' -z 1845287 ']' 00:35:34.971 07:41:07 chaining -- common/autotest_common.sh@954 -- # kill -0 1845287 00:35:34.971 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1845287) - No such process 00:35:34.971 07:41:07 chaining -- common/autotest_common.sh@977 -- # echo 'Process with pid 1845287 is not found' 00:35:34.971 Process with pid 1845287 is not found 00:35:34.971 07:41:07 chaining -- bdev/chaining.sh@171 -- # wait 1845287 00:35:34.971 07:41:07 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:34.971 07:41:07 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:34.971 07:41:07 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:34.971 07:41:07 chaining 
-- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:34.971 07:41:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:34.971 07:41:07 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.0 (0x8086 - 0x159b)' 00:35:34.972 Found 0000:20:00.0 (0x8086 - 0x159b) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:35:34.972 
07:41:07 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.1 (0x8086 - 0x159b)' 00:35:34.972 Found 0000:20:00.1 (0x8086 - 0x159b) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.0: cvl_0_0' 00:35:34.972 Found net devices under 0000:20:00.0: cvl_0_0 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.1: cvl_0_1' 00:35:34.972 Found net devices under 0000:20:00.1: cvl_0_1 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:35:34.972 07:41:07 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:35:35.230 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:35.230 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.150 ms 00:35:35.230 00:35:35.230 --- 10.0.0.2 ping statistics --- 00:35:35.230 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:35.230 rtt min/avg/max/mdev = 0.150/0.150/0.150/0.000 ms 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:35:35.230 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:35:35.230 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.207 ms 00:35:35.230 00:35:35.230 --- 10.0.0.1 ping statistics --- 00:35:35.230 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:35.230 rtt min/avg/max/mdev = 0.207/0.207/0.207/0.000 ms 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@422 -- # return 0 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:35.230 07:41:07 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@481 -- # nvmfpid=1846496 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@482 -- # waitforlisten 1846496 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@831 -- # '[' -z 1846496 ']' 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@838 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:35.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:35.230 07:41:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:35.230 07:41:07 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:35.230 [2024-07-25 07:41:07.663317] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:35:35.230 [2024-07-25 07:41:07.663377] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of 
QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:35.230 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.230 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:35.231 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.231 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:35.231 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.231 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:35.231 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.231 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:35.231 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.231 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:35.231 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:35.231 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:35.488 [2024-07-25 07:41:07.791151] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:35.488 [2024-07-25 07:41:07.875776] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:35.488 [2024-07-25 07:41:07.875821] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:35.489 [2024-07-25 07:41:07.875834] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:35.489 [2024-07-25 07:41:07.875846] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:35.489 [2024-07-25 07:41:07.875855] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
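The nvmf_tcp_init sequence a little further up is plain iproute2 plumbing: move the target-side ice port into its own network namespace, address both ends, bring the links up, open the NVMe/TCP port, and ping in both directions to prove reachability. Condensed, with the interface names and addresses from this run:

  #!/usr/bin/env bash
  set -e
  NS=cvl_0_0_ns_spdk                             # namespace the nvmf target runs in
  ip netns add "$NS"
  ip link set cvl_0_0 netns "$NS"                # target-side port moves into the namespace
  ip addr add 10.0.0.1/24 dev cvl_0_1            # initiator side stays in the root namespace
  ip netns exec "$NS" ip addr add 10.0.0.2/24 dev cvl_0_0
  ip link set cvl_0_1 up
  ip netns exec "$NS" ip link set cvl_0_0 up
  ip netns exec "$NS" ip link set lo up
  iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT
  ping -c 1 10.0.0.2                             # initiator -> target
  ip netns exec "$NS" ping -c 1 10.0.0.1         # target -> initiator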
00:35:35.489 [2024-07-25 07:41:07.875887] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@864 -- # return 0 00:35:36.422 07:41:08 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:36.422 07:41:08 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:36.422 07:41:08 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:36.422 malloc0 00:35:36.422 [2024-07-25 07:41:08.903463] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:36.422 [2024-07-25 07:41:08.919672] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:36.422 07:41:08 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:35:36.422 07:41:08 chaining -- bdev/chaining.sh@189 -- # bperfpid=1846657 00:35:36.422 07:41:08 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1846657 /var/tmp/bperf.sock 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@831 -- # '[' -z 1846657 ']' 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:36.422 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:36.422 07:41:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:36.422 07:41:08 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:36.681 [2024-07-25 07:41:08.990356] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 
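From here the shape of the final test is: the nvmf target (pid 1846496, inside the namespace) exports malloc0 over NVMe/TCP on 10.0.0.2:4420, and the bdevperf instance answering on /var/tmp/bperf.sock attaches to it as an initiator — the nvme0n1 that appears below — and layers crypto0 on top. The log does not show the individual RPCs, so the following is only a plausible sketch with a placeholder NQN:

  #!/usr/bin/env bash
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  tgt_sock=/var/tmp/spdk.sock        # nvmf_tgt RPC socket
  bperf_sock=/var/tmp/bperf.sock     # bdevperf RPC socket
  nqn=nqn.2016-06.io.spdk:cnode1     # placeholder subsystem NQN

  # Target side: export malloc0 over NVMe/TCP on 10.0.0.2:4420.
  "$rpc" -s "$tgt_sock" nvmf_create_transport -t TCP
  "$rpc" -s "$tgt_sock" nvmf_create_subsystem "$nqn" -a -s SPDK00000000000001
  "$rpc" -s "$tgt_sock" nvmf_subsystem_add_ns "$nqn" malloc0
  "$rpc" -s "$tgt_sock" nvmf_subsystem_add_listener "$nqn" -t tcp -a 10.0.0.2 -s 4420

  # Initiator side, inside bdevperf: attach the remote namespace, encrypt it with key0
  # (key0 assumed registered via accel_crypto_key_create as in the earlier sketches).
  "$rpc" -s "$bperf_sock" bdev_nvme_attach_controller -b nvme0 -t tcp -f ipv4 \
      -a 10.0.0.2 -s 4420 -n "$nqn"
  "$rpc" -s "$bperf_sock" bdev_crypto_create -n key0 nvme0n1 crypto0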
00:35:36.681 [2024-07-25 07:41:08.990421] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1846657 ] 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:36.681 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:36.681 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:36.681 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:36.681 [2024-07-25 07:41:09.123158] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:36.681 [2024-07-25 07:41:09.210319] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:37.614 07:41:09 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:37.614 07:41:09 chaining -- common/autotest_common.sh@864 -- # return 0 00:35:37.614 07:41:09 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:35:37.614 07:41:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:37.871 [2024-07-25 07:41:10.221357] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:37.871 nvme0n1 00:35:37.871 true 00:35:37.871 crypto0 00:35:37.871 07:41:10 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:37.871 Running I/O for 5 seconds... 
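Because bdevperf was started with -z it sits idle after configuration until it is told to start; the perform_tests call goes over the same bperf.sock used for the setup RPCs, and the "Running I/O for 5 seconds..." line plus the latency table that follows are its output. Invocation as used in this run:

  # Kick off the configured workload in the bdevperf instance on /var/tmp/bperf.sock
  # (the workload itself was fixed on its command line: -t 5 -w verify -o 4096 -q 256).
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
      -s /var/tmp/bperf.sock perform_tests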
00:35:43.130 00:35:43.130 Latency(us) 00:35:43.130 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:43.130 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:43.130 Verification LBA range: start 0x0 length 0x2000 00:35:43.130 crypto0 : 5.02 9488.07 37.06 0.00 0.00 26900.63 3316.12 21915.24 00:35:43.130 =================================================================================================================== 00:35:43.130 Total : 9488.07 37.06 0.00 0.00 26900.63 3316.12 21915.24 00:35:43.130 0 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@205 -- # sequence=95216 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:43.130 07:41:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@206 -- # encrypt=47608 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:43.387 07:41:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@207 -- # decrypt=47608 
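get_stat_bperf is the same jq pattern as the earlier get_stat, only routed through rpc_bperf (rpc.py -s /var/tmp/bperf.sock), so the counters come from the bdevperf app itself rather than a separate target. If one wanted all of the counters in a single round trip instead of one accel_get_stats call per opcode, something like this would do (same assumptions as the earlier sketches):

  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats |
      jq -r '.sequence_executed,
             (.operations[]
              | select(.opcode == "encrypt" or .opcode == "decrypt" or .opcode == "crc32c")
              | "\(.opcode)=\(.executed)")'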
00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:43.645 07:41:16 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:43.903 07:41:16 chaining -- bdev/chaining.sh@208 -- # crc32c=95216 00:35:43.903 07:41:16 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:35:43.903 07:41:16 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:35:43.904 07:41:16 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:35:43.904 07:41:16 chaining -- bdev/chaining.sh@214 -- # killprocess 1846657 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@950 -- # '[' -z 1846657 ']' 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@954 -- # kill -0 1846657 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@955 -- # uname 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1846657 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1846657' 00:35:43.904 killing process with pid 1846657 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@969 -- # kill 1846657 00:35:43.904 Received shutdown signal, test time was about 5.000000 seconds 00:35:43.904 00:35:43.904 Latency(us) 00:35:43.904 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:43.904 =================================================================================================================== 00:35:43.904 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:43.904 07:41:16 chaining -- common/autotest_common.sh@974 -- # wait 1846657 00:35:44.162 07:41:16 chaining -- bdev/chaining.sh@219 -- # bperfpid=1847997 00:35:44.162 07:41:16 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:35:44.162 07:41:16 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1847997 /var/tmp/bperf.sock 00:35:44.162 07:41:16 chaining -- common/autotest_common.sh@831 -- # '[' -z 1847997 ']' 00:35:44.162 07:41:16 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:44.162 07:41:16 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:44.162 07:41:16 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:35:44.162 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:44.162 07:41:16 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:44.162 07:41:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:44.162 [2024-07-25 07:41:16.669287] Starting SPDK v24.09-pre git sha1 e5ef9abc9 / DPDK 24.03.0 initialization... 00:35:44.162 [2024-07-25 07:41:16.669349] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1847997 ] 00:35:44.421 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device cannot be used (this message pair repeated once per requested QAT VF: 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7, 32 devices in total) 00:35:44.421 [2024-07-25 07:41:16.801925] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:44.421 [2024-07-25 07:41:16.887553] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:35:45.355 07:41:17 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:45.355 07:41:17 chaining -- common/autotest_common.sh@864 -- # return 0 00:35:45.355 07:41:17 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:35:45.355 07:41:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:45.613 [2024-07-25 07:41:17.967562] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:45.613 nvme0n1 00:35:45.613 true 00:35:45.613 crypto0 00:35:45.613 07:41:18 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:45.613 Running I/O for 5 seconds...
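The launch sequence above shows the chaining test's pattern for driving a bdevperf instance over a UNIX-domain RPC socket: start bdevperf, wait for /var/tmp/bperf.sock to appear, then trigger the workload with bdevperf.py perform_tests. A minimal sketch of that flow, assuming the workspace paths from the log; the bdevperf binary location, the wait loop, and the -z/-r/-q/-o/-w/-t launch flags here are illustrative rather than the exact helper the suite uses:

# Sketch only: start bdevperf against the bperf RPC socket and kick off the run.
rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
sock=/var/tmp/bperf.sock

# Workload values mirror the results above (verify, depth 32, IO size 65536, 5 s).
"$rootdir/build/examples/bdevperf" -z -r "$sock" -q 32 -o 65536 -w verify -t 5 &
bperf_pid=$!

# Wait for the app to create its UNIX-domain RPC socket before talking to it.
while [ ! -S "$sock" ]; do sleep 0.1; done

# perform_tests tells the already-running bdevperf to start the configured workload;
# the suite later stops the app by pid, as the killprocess calls below show.
"$rootdir/examples/bdev/bdevperf/bdevperf.py" -s "$sock" perform_tests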
00:35:50.879 00:35:50.879 Latency(us) 00:35:50.879 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:50.879 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:35:50.879 Verification LBA range: start 0x0 length 0x200 00:35:50.879 crypto0 : 5.01 1871.13 116.95 0.00 0.00 16755.63 1900.54 20132.66 00:35:50.879 =================================================================================================================== 00:35:50.879 Total : 1871.13 116.95 0.00 0.00 16755.63 1900.54 20132.66 00:35:50.879 0 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@233 -- # sequence=18742 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:50.879 07:41:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@234 -- # encrypt=9371 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:51.137 07:41:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@235 -- # decrypt=9371 
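Each of the get_stat_bperf calls above follows one pattern: query the running bperf app's accel statistics over the RPC socket and pull a single counter out of the JSON with jq. A condensed sketch of that helper, reusing the jq filters visible in the log; the function bodies are simplified from the real chaining.sh helpers:

# Sketch only: read one accel counter from the bperf app over its RPC socket.
rpc_bperf() {
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock "$@"
}

get_stat_bperf() {
    local event=$1 opcode=$2
    if [ -z "$opcode" ]; then
        # Top-level counter, e.g. sequence_executed.
        rpc_bperf accel_get_stats | jq -r ".$event"
    else
        # Per-opcode counter, e.g. executed for encrypt, decrypt or crc32c.
        rpc_bperf accel_get_stats | jq -r ".operations[] | select(.opcode == \"$opcode\").$event"
    fi
}

sequence=$(get_stat_bperf sequence_executed)
encrypt=$(get_stat_bperf executed encrypt)
decrypt=$(get_stat_bperf executed decrypt)
# The chaining test then asserts encrypt + decrypt == sequence (and == crc32c), as below.
(( encrypt + decrypt == sequence )) || echo "stat mismatch"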
00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:51.395 07:41:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:51.653 07:41:24 chaining -- bdev/chaining.sh@236 -- # crc32c=18742 00:35:51.653 07:41:24 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:35:51.653 07:41:24 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:35:51.653 07:41:24 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:35:51.653 07:41:24 chaining -- bdev/chaining.sh@242 -- # killprocess 1847997 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@950 -- # '[' -z 1847997 ']' 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@954 -- # kill -0 1847997 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@955 -- # uname 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1847997 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1847997' 00:35:51.653 killing process with pid 1847997 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@969 -- # kill 1847997 00:35:51.653 Received shutdown signal, test time was about 5.000000 seconds 00:35:51.653 00:35:51.653 Latency(us) 00:35:51.653 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:51.653 =================================================================================================================== 00:35:51.653 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:51.653 07:41:24 chaining -- common/autotest_common.sh@974 -- # wait 1847997 00:35:51.911 07:41:24 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:35:51.911 07:41:24 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:51.911 07:41:24 chaining -- nvmf/common.sh@117 -- # sync 00:35:51.911 07:41:24 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:51.911 07:41:24 chaining -- nvmf/common.sh@120 -- # set +e 00:35:51.911 07:41:24 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:51.912 07:41:24 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:51.912 rmmod nvme_tcp 00:35:51.912 rmmod nvme_fabrics 00:35:51.912 rmmod nvme_keyring 00:35:51.912 07:41:24 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:51.912 07:41:24 chaining -- nvmf/common.sh@124 -- # set -e 00:35:51.912 07:41:24 chaining -- nvmf/common.sh@125 -- # return 0 00:35:51.912 07:41:24 chaining -- nvmf/common.sh@489 -- # 
'[' -n 1846496 ']' 00:35:51.912 07:41:24 chaining -- nvmf/common.sh@490 -- # killprocess 1846496 00:35:51.912 07:41:24 chaining -- common/autotest_common.sh@950 -- # '[' -z 1846496 ']' 00:35:51.912 07:41:24 chaining -- common/autotest_common.sh@954 -- # kill -0 1846496 00:35:51.912 07:41:24 chaining -- common/autotest_common.sh@955 -- # uname 00:35:51.912 07:41:24 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:51.912 07:41:24 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1846496 00:35:52.170 07:41:24 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:35:52.170 07:41:24 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:35:52.170 07:41:24 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1846496' 00:35:52.170 killing process with pid 1846496 00:35:52.170 07:41:24 chaining -- common/autotest_common.sh@969 -- # kill 1846496 00:35:52.170 07:41:24 chaining -- common/autotest_common.sh@974 -- # wait 1846496 00:35:52.170 07:41:24 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:52.170 07:41:24 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:52.170 07:41:24 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:52.170 07:41:24 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:52.170 07:41:24 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:52.170 07:41:24 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:52.170 07:41:24 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:52.170 07:41:24 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:54.702 07:41:26 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:35:54.702 07:41:26 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:35:54.702 00:35:54.702 real 0m50.529s 00:35:54.702 user 1m0.878s 00:35:54.702 sys 0m13.223s 00:35:54.702 07:41:26 chaining -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:54.702 07:41:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:54.702 ************************************ 00:35:54.702 END TEST chaining 00:35:54.702 ************************************ 00:35:54.702 07:41:26 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:35:54.702 07:41:26 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:35:54.702 07:41:26 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:35:54.702 07:41:26 -- spdk/autotest.sh@379 -- # [[ 0 -eq 1 ]] 00:35:54.702 07:41:26 -- spdk/autotest.sh@384 -- # trap - SIGINT SIGTERM EXIT 00:35:54.702 07:41:26 -- spdk/autotest.sh@386 -- # timing_enter post_cleanup 00:35:54.702 07:41:26 -- common/autotest_common.sh@724 -- # xtrace_disable 00:35:54.702 07:41:26 -- common/autotest_common.sh@10 -- # set +x 00:35:54.702 07:41:26 -- spdk/autotest.sh@387 -- # autotest_cleanup 00:35:54.702 07:41:26 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:35:54.702 07:41:26 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:35:54.702 07:41:26 -- common/autotest_common.sh@10 -- # set +x 00:36:01.292 INFO: APP EXITING 00:36:01.292 INFO: killing all VMs 00:36:01.292 INFO: killing vhost app 00:36:01.292 INFO: EXIT DONE 00:36:04.571 Waiting for block devices as requested 00:36:04.571 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:04.571 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:04.830 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:04.830 0000:00:04.4 (8086 2021): vfio-pci -> 
ioatdma 00:36:04.830 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:05.088 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:05.088 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:05.088 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:05.347 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:05.347 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:05.347 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:05.606 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:05.606 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:05.606 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:05.606 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:05.864 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:05.864 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:36:10.048 Cleaning 00:36:10.048 Removing: /var/run/dpdk/spdk0/config 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:36:10.048 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:36:10.048 Removing: /var/run/dpdk/spdk0/hugepage_info 00:36:10.048 Removing: /dev/shm/nvmf_trace.0 00:36:10.048 Removing: /dev/shm/spdk_tgt_trace.pid1534376 00:36:10.048 Removing: /var/run/dpdk/spdk0 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1529397 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1533018 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1534376 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1535067 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1536049 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1536309 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1537278 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1537538 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1537826 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1541247 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1543223 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1543568 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1543983 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1544432 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1544763 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1545045 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1545307 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1545608 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1546483 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1549890 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1550178 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1550498 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1550803 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1550826 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1551058 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1551332 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1551608 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1551876 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1552149 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1552451 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1552775 00:36:10.048 Removing: /var/run/dpdk/spdk_pid1553050 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1553315 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1553588 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1553934 00:36:10.307 Removing: 
/var/run/dpdk/spdk_pid1554283 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1554689 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1555219 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1555504 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1555815 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1556085 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1556363 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1556631 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1556909 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1557171 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1557578 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1557923 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1558284 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1558745 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1559042 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1559522 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1559871 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1560178 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1560477 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1560813 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1561334 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1561754 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1562031 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1566709 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1568923 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1571021 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1572210 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1573596 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1573886 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1574006 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1574178 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1579033 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1579605 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1580922 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1581216 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1590842 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1592907 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1593937 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1598905 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1600924 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1602073 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1606872 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1609746 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1610743 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1622360 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1625327 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1626696 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1638073 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1640740 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1641817 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1653517 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1657989 00:36:10.307 Removing: /var/run/dpdk/spdk_pid1659366 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1672427 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1675403 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1676578 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1689645 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1693151 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1694345 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1707390 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1711871 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1713283 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1714449 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1718117 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1724454 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1727976 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1733474 00:36:10.566 Removing: 
/var/run/dpdk/spdk_pid1737476 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1743780 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1747055 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1754679 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1757554 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1765278 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1767976 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1775282 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1777987 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1783108 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1783477 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1784005 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1784447 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1785010 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1785885 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1786922 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1787308 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1789442 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1792133 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1794405 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1796193 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1798325 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1800526 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1802596 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1804466 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1805069 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1805610 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1808142 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1810486 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1812743 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1814077 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1815643 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1816211 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1816407 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1816541 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1816827 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1817028 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1818349 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1820343 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1822756 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1823822 00:36:10.566 Removing: /var/run/dpdk/spdk_pid1824744 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1825024 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1825198 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1825318 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1826445 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1827118 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1827587 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1830136 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1832399 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1834809 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1836152 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1837562 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1838350 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1838385 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1842746 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1843033 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1843313 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1843353 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1843650 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1844233 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1845287 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1846657 00:36:10.824 Removing: /var/run/dpdk/spdk_pid1847997 00:36:10.824 Clean 00:36:10.824 07:41:43 -- common/autotest_common.sh@1451 -- # return 0 00:36:10.824 07:41:43 -- spdk/autotest.sh@388 -- # timing_exit post_cleanup 00:36:10.824 
07:41:43 -- common/autotest_common.sh@730 -- # xtrace_disable 00:36:10.824 07:41:43 -- common/autotest_common.sh@10 -- # set +x 00:36:10.824 07:41:43 -- spdk/autotest.sh@390 -- # timing_exit autotest 00:36:10.824 07:41:43 -- common/autotest_common.sh@730 -- # xtrace_disable 00:36:10.824 07:41:43 -- common/autotest_common.sh@10 -- # set +x 00:36:11.082 07:41:43 -- spdk/autotest.sh@391 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:11.082 07:41:43 -- spdk/autotest.sh@393 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:36:11.082 07:41:43 -- spdk/autotest.sh@393 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:36:11.082 07:41:43 -- spdk/autotest.sh@395 -- # hash lcov 00:36:11.082 07:41:43 -- spdk/autotest.sh@395 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:36:11.082 07:41:43 -- spdk/autotest.sh@397 -- # hostname 00:36:11.082 07:41:43 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-19 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:36:11.082 geninfo: WARNING: invalid characters removed from testname! 00:36:37.616 07:42:09 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:38.994 07:42:11 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:40.900 07:42:12 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:42.275 07:42:14 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:43.653 07:42:16 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:45.560 07:42:17 -- spdk/autotest.sh@403 -- # lcov 
--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:46.939 07:42:19 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:36:47.198 07:42:19 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:36:47.198 07:42:19 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:36:47.198 07:42:19 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:36:47.198 07:42:19 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:36:47.198 07:42:19 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:47.198 07:42:19 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:47.198 07:42:19 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:47.198 07:42:19 -- paths/export.sh@5 -- $ export PATH 00:36:47.198 07:42:19 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:47.198 07:42:19 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:47.198 07:42:19 -- common/autobuild_common.sh@447 -- $ date +%s 00:36:47.198 07:42:19 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721886139.XXXXXX 00:36:47.198 07:42:19 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721886139.4v5D66 00:36:47.198 07:42:19 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:36:47.198 07:42:19 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']' 00:36:47.198 07:42:19 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:36:47.198 07:42:19 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:36:47.198 07:42:19 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:36:47.198 07:42:19 -- common/autobuild_common.sh@463 -- $ get_config_params 00:36:47.198 07:42:19 -- common/autotest_common.sh@398 -- $ xtrace_disable 00:36:47.198 07:42:19 -- common/autotest_common.sh@10 -- $ set +x 00:36:47.198 07:42:19 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:36:47.198 07:42:19 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:36:47.198 07:42:19 -- pm/common@17 -- $ local monitor 00:36:47.198 07:42:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:47.198 07:42:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:47.198 07:42:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:47.198 07:42:19 -- pm/common@21 -- $ date +%s 00:36:47.198 07:42:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:47.198 07:42:19 -- pm/common@21 -- $ date +%s 00:36:47.198 07:42:19 -- pm/common@25 -- $ sleep 1 00:36:47.198 07:42:19 -- pm/common@21 -- $ date +%s 00:36:47.198 07:42:19 -- pm/common@21 -- $ date +%s 00:36:47.198 07:42:19 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721886139 00:36:47.198 07:42:19 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721886139 00:36:47.198 07:42:19 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721886139 00:36:47.198 07:42:19 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721886139 00:36:47.198 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721886139_collect-vmstat.pm.log 00:36:47.198 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721886139_collect-cpu-load.pm.log 00:36:47.198 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721886139_collect-cpu-temp.pm.log 00:36:47.199 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721886139_collect-bmc-pm.bmc.pm.log 00:36:48.136 07:42:20 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:36:48.136 07:42:20 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:36:48.136 07:42:20 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:48.136 07:42:20 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:36:48.136 07:42:20 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:36:48.136 07:42:20 -- spdk/autopackage.sh@19 -- $ timing_finish 00:36:48.136 07:42:20 -- common/autotest_common.sh@736 -- $ 
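The autopackage step above backgrounds the pm resource monitors (collect-cpu-load, collect-vmstat, collect-cpu-temp, collect-bmc-pm) and, as the next block shows, stops them again by reading the pid files under the output/power directory and sending TERM. A rough sketch of that start/stop-by-pidfile pattern; directory, script and pidfile names follow the log, while the assumption that each monitor writes its own <name>.pid is inferred from the stop step:

# Sketch only: background a monitor, then stop it via the pidfile it leaves behind.
output=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power
scripts=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm

"$scripts/collect-cpu-load" -d "$output" -l -p "monitor.autopackage.sh.$(date +%s)" &

# stop_monitor_resources walks the known pidfiles and TERMs whatever is still running.
for pidfile in "$output"/collect-cpu-load.pid "$output"/collect-vmstat.pid; do
    [ -e "$pidfile" ] || continue
    kill -TERM "$(cat "$pidfile")" 2>/dev/null || true
done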
flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:36:48.136 07:42:20 -- common/autotest_common.sh@737 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:36:48.136 07:42:20 -- common/autotest_common.sh@739 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:48.136 07:42:20 -- spdk/autopackage.sh@20 -- $ exit 0 00:36:48.136 07:42:20 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:36:48.136 07:42:20 -- pm/common@29 -- $ signal_monitor_resources TERM 00:36:48.136 07:42:20 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:36:48.136 07:42:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:48.136 07:42:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:36:48.136 07:42:20 -- pm/common@44 -- $ pid=1861393 00:36:48.136 07:42:20 -- pm/common@50 -- $ kill -TERM 1861393 00:36:48.136 07:42:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:48.136 07:42:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:36:48.136 07:42:20 -- pm/common@44 -- $ pid=1861395 00:36:48.136 07:42:20 -- pm/common@50 -- $ kill -TERM 1861395 00:36:48.136 07:42:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:48.136 07:42:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:36:48.136 07:42:20 -- pm/common@44 -- $ pid=1861397 00:36:48.136 07:42:20 -- pm/common@50 -- $ kill -TERM 1861397 00:36:48.136 07:42:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:36:48.136 07:42:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:36:48.136 07:42:20 -- pm/common@44 -- $ pid=1861420 00:36:48.136 07:42:20 -- pm/common@50 -- $ sudo -E kill -TERM 1861420 00:36:48.136 + [[ -n 1399777 ]] 00:36:48.136 + sudo kill 1399777 00:36:48.406 [Pipeline] } 00:36:48.425 [Pipeline] // stage 00:36:48.431 [Pipeline] } 00:36:48.450 [Pipeline] // timeout 00:36:48.457 [Pipeline] } 00:36:48.475 [Pipeline] // catchError 00:36:48.481 [Pipeline] } 00:36:48.499 [Pipeline] // wrap 00:36:48.506 [Pipeline] } 00:36:48.524 [Pipeline] // catchError 00:36:48.534 [Pipeline] stage 00:36:48.536 [Pipeline] { (Epilogue) 00:36:48.551 [Pipeline] catchError 00:36:48.553 [Pipeline] { 00:36:48.569 [Pipeline] echo 00:36:48.571 Cleanup processes 00:36:48.577 [Pipeline] sh 00:36:48.861 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:48.861 1861498 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:36:48.861 1861842 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:48.875 [Pipeline] sh 00:36:49.159 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:49.159 ++ grep -v 'sudo pgrep' 00:36:49.159 ++ awk '{print $1}' 00:36:49.159 + sudo kill -9 1861498 00:36:49.171 [Pipeline] sh 00:36:49.454 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:36:49.454 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:36:56.053 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:37:02.632 [Pipeline] sh 00:37:02.917 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:37:02.917 Artifacts 
sizes are good 00:37:02.932 [Pipeline] archiveArtifacts 00:37:02.939 Archiving artifacts 00:37:03.088 [Pipeline] sh 00:37:03.377 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:37:03.392 [Pipeline] cleanWs 00:37:03.402 [WS-CLEANUP] Deleting project workspace... 00:37:03.402 [WS-CLEANUP] Deferred wipeout is used... 00:37:03.409 [WS-CLEANUP] done 00:37:03.411 [Pipeline] } 00:37:03.431 [Pipeline] // catchError 00:37:03.443 [Pipeline] sh 00:37:03.721 + logger -p user.info -t JENKINS-CI 00:37:03.729 [Pipeline] } 00:37:03.747 [Pipeline] // stage 00:37:03.753 [Pipeline] } 00:37:03.770 [Pipeline] // node 00:37:03.777 [Pipeline] End of Pipeline 00:37:03.810 Finished: SUCCESS
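For reference, the "Cleanup processes" step in the epilogue above uses a single idiom: list anything still running out of the workspace with pgrep -af, drop the pgrep invocation itself, and kill the surviving pids. A minimal sketch of that idiom, with a guard added so an empty pid list does not fail the step:

# Sketch only: kill leftover processes still running from the workspace.
workspace=/var/jenkins/workspace/crypto-phy-autotest/spdk

# pgrep -af prints "pid cmdline"; filter out the pgrep line and keep the pids.
pids=$(sudo pgrep -af "$workspace" | grep -v 'sudo pgrep' | awk '{print $1}')

# Errors are tolerated on purpose so cleanup never fails the build.
[ -n "$pids" ] && sudo kill -9 $pids || true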